typechat-cli v0.3.1

License: MIT

TypeChat for the command line, on OpenAI and Ollama

This is a utility application that wraps the TypeChat library in a command-line interface, convenient for various repetitive AI-based tasks. Here's an example:

$ npx typechat-cli -s ./example/sentiment-schema.ts "This is cool and you are cool too"
{
  "sentiment": "positive"
}

The output of the command is a JSON object that conforms to the schema you provide. Here's the example schema used above, sentiment-schema.ts:

export interface ResponseShape {
  sentiment: 'positive' | 'negative' | 'neutral' // The sentiment of the input text, with positive, negative, and neutral as the only options
}

Note that the comments are very important! They are effectively your prompt to the model and guide how the input is processed and the output is generated. The schema is also used to validate the output of the OpenAI API. By default, typechat-cli looks for a root interface named ResponseShape, though you can override this with the -t parameter.
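For instance, here is a sketch of a schema whose root interface is not named ResponseShape (the Summary interface, its fields, and the file name are all invented for illustration), along with a value the model might return for it:

```typescript
// summary-schema.ts — hypothetical schema whose root type is named Summary,
// not ResponseShape, so it must be selected with the -t parameter.
export interface Summary {
  title: string // A short title summarizing the input text
  keywords: string[] // Up to five keywords, each a single lowercase word
}

// A value the model might plausibly return for this schema:
const example: Summary = { title: 'TypeChat CLI', keywords: ['ai', 'cli'] }
console.log(JSON.stringify(example))
```

You would then point typechat-cli at the non-default root type with something like `npx typechat-cli -s ./summary-schema.ts -t Summary "some input text"`.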

Here's another example for extracting links found in a Markdown document:

// An extracted http or https link from the supplied input. If text does not have a link, it should be ignored.
export interface LinkInformation {
  url: string // The URL of the link. Ignore lines that do not have a link. Links must start with http:// or https://
  description: string // The description of the link given in the text. If no description is given, try to infer one from the URL
  category: string // The general category of the link and description, given as a single word
}

export interface ResponseShape {
  links: LinkInformation[]
}

You might use it like this:

# Extract all of the links from our docs
$ npx typechat-cli -s ./example/links-schema.ts ./docs/**/*.md

typechat-cli will emit a single JSON array containing all of the results, in the same order as the input files.
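Since the output is plain JSON, it is easy to post-process. Here is a sketch that groups extracted links by category; the sample data is invented, standing in for what a run over real docs might produce:

```typescript
// Sketch: group link-extraction results by category.
// The `results` array below is invented sample data; in practice you would
// parse the JSON array printed by a run such as:
//   npx typechat-cli -s ./example/links-schema.ts ./docs/**/*.md
interface LinkInformation {
  url: string
  description: string
  category: string
}
interface ResponseShape {
  links: LinkInformation[]
}

const results: ResponseShape[] = [
  { links: [{ url: 'https://example.com', description: 'Example site', category: 'reference' }] },
  { links: [{ url: 'https://nodejs.org', description: 'Node.js', category: 'runtime' }] },
]

// Collect links into buckets keyed by their category
const byCategory = new Map<string, LinkInformation[]>()
for (const result of results) {
  for (const link of result.links) {
    const bucket = byCategory.get(link.category) ?? []
    bucket.push(link)
    byCategory.set(link.category, bucket)
  }
}
console.log([...byCategory.keys()])
```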

How do I choose OpenAI or Ollama?

typechat-cli works with both OpenAI and Ollama. Setting the appropriate environment variable chooses which one to use:

  • OLLAMA_ENDPOINT - the base URL for Ollama; http://localhost:11434 is the default address Ollama listens on
  • OPENAI_API_KEY - the API key to use for OpenAI

What model does this use?

By default, this uses gpt-4 for OpenAI and llama2 for Ollama, though you can override it with the -m parameter. For many tasks, gpt-3.5-turbo will work and is much cheaper if using OpenAI!

Getting more verbose information

Sometimes when operating on an entire directory of files, it can be convenient to get the input file alongside the data. If the --with-text flag is passed in, the data will be returned in the form:

interface ReturnedData {
  filename: string | null // The fully qualified path to the input file, or null if prompt text was directly given
  input: string // The contents of the input file or prompt text
  data: ResponseShape // The data returned
}
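Assuming each input produces one such record (the sample records below are invented, reusing the sentiment schema from earlier as the data field), consuming this output might look like:

```typescript
// Sketch: consume --with-text output. The records here are invented sample
// data; real output comes from a run with the --with-text flag.
interface ReturnedData {
  filename: string | null // null when prompt text was given directly
  input: string
  data: { sentiment: string } // ResponseShape from the sentiment schema
}

const results: ReturnedData[] = [
  { filename: '/docs/a.md', input: 'This is great', data: { sentiment: 'positive' } },
  { filename: null, input: 'Meh', data: { sentiment: 'neutral' } },
]

// Pair each result with its source for review
for (const r of results) {
  console.log(`${r.filename ?? '(stdin)'}: ${r.data.sentiment}`)
}
```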

Piping data in from stdin

You can also pipe data in from stdin, which will be used as the input text. For example:

$ echo "Yes this is very cool" | npx ts-node ./src/index.ts -s ./example/sentiment-schema.ts

No input provided, reading from stdin...
{
  "sentiment": "positive"
}

How do I run this from the Git repo?

$ npx ts-node ./src/index.ts -s ./example/sentiment-schema.ts "This is cool and you are cool too"