ts-chatgpt

A library for receiving typed, plain responses from the official ChatGPT API by OpenAI.

Install

npm install ts-chatgpt

It has been confirmed to work with Remix's loader function.
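
As a minimal sketch, calling prompt from a Remix loader might look like the following. The route path, prompt content, and response handling here are illustrative assumptions, not part of this library:

// app/routes/summary.ts (hypothetical route, shown as a sketch only)
import { json } from "@remix-run/node";
import { prompt } from "ts-chatgpt";

export const loader = async () => {
  const response = await prompt({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Say hello in one sentence." }],
    options: { temperature: 0.1 },
  });

  // Return the raw API response to the client as JSON.
  return json(response);
};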

Usage

| Function | Description | Parameters | Return |
| --- | --- | --- | --- |
| prompt | Get a response from the ChatGPT API. | props: an object containing the model name you want to use, an array of Message, and an options object. | Promise<ChatGPTResponse> |

When calling this function, be sure to set the OPENAI_API_KEY environment variable to the API key you received from OpenAI.

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-4",
  messages: [
    {
      role: "user",
      content:
        "In the style of Nicholas Sparks, please summarize the following introductory You are limited to 140 characters. 'I love Android and I develop applications using Kotlin and Jetpack Compose.'",
    },
  ],
  options: {
    temperature: 0.1,
  },
});

Since dotenv.config() is automatically called internally, you do not need to install dotenv and load OPENAI_API_KEY yourself.

Props

When calling prompt(), you must pass an object containing the following as an argument:

| Key | Description | Type | Required |
| --- | --- | --- | --- |
| model | The model name you want to use for the ChatGPT API. | string | |
| messages | An array of Message type. | Message[] | |
| options | An object containing options. | PromptOptions | |
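
Putting this table together with the Message and PromptOptions tables below, the argument to prompt() can be sketched roughly as follows. This is an illustration assembled from the documentation, not the library's actual exported type definitions, and which fields may be omitted is an assumption since the Required markers are not reproduced here:

// Illustrative only: a rough shape of the argument to prompt(), assembled
// from the Props, Message, and PromptOptions tables in this README.
// The library's real exported types may differ.
type PromptProps = {
  model: string; // e.g. "gpt-3.5-turbo-0301", "gpt-3.5-turbo" or "gpt-4"
  messages: {
    role: "system" | "assistant" | "user";
    content: string;
  }[];
  // Whether options can be omitted is an assumption; the Required column
  // values were not recoverable in this rendering.
  options?: {
    apiKey?: string; // falls back to the OPENAI_API_KEY environment variable
    temperature?: number;
  };
};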

The following values are currently available for model. More will be added in the future.

| Model | Description | Available |
| --- | --- | --- |
| gpt-3.5-turbo-0301 | The default model. | |
| gpt-3.5-turbo | - | |
| gpt-4 | GPT-4 is the latest and most powerful model. | |

The following values can be specified by the user as messages to be passed to the prompt function.

| Key | Description | Type | Required |
| --- | --- | --- | --- |
| role | The role of the message. | "system", "assistant" or "user" | |
| content | The content of the message. | string | |
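
A conversation can combine these roles; for example, the following sketch sets the assistant's behaviour with a system message and replays earlier turns before asking a new question. The message content here is illustrative only:

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-3.5-turbo",
  messages: [
    // A system message steers the assistant's overall behaviour.
    { role: "system", content: "You are a concise assistant." },
    // Earlier turns can be replayed as user/assistant messages.
    { role: "user", content: "What is Jetpack Compose?" },
    { role: "assistant", content: "A declarative UI toolkit for Android." },
    // The final user message is the one you want answered next.
    { role: "user", content: "How does it compare to XML layouts?" },
  ],
  options: { temperature: 0.1 },
});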

The following values can be specified by the user as options to be passed to the prompt function.

| Key | Description | Type |
| --- | --- | --- |
| apiKey | API key that can be obtained from the OpenAI configuration page. You can omit this value by setting the OPENAI_API_KEY environment variable. | string |
| temperature | The lower the temperature, the more accurate the results. Temperatures at or near 0 (e.g. 0.1 or 0.2) tend to give better results in most cases. With GPT-3, higher temperatures produce more creative and random results, while with Codex, higher temperatures can make the response truly random and erratic. | number |
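
For example, the following sketch supplies the key through options.apiKey instead of the OPENAI_API_KEY environment variable, together with a low temperature for more deterministic output. The key shown is a placeholder, not a real value:

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-4",
  messages: [{ role: "user", content: "Summarize ts-chatgpt in one sentence." }],
  options: {
    // Placeholder value; in a real app, read the key from secure configuration.
    apiKey: "sk-your-api-key",
    // A low temperature keeps the output close to deterministic.
    temperature: 0.1,
  },
});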

For detailed specifications of the ChatGPT API, please refer to the official API documentation.

Response Type

There are two types of return values for the prompt function: ChatGPT and ChatGPTError.

| Type | Description |
| --- | --- |
| ChatGPT | The response from the ChatGPT API. |
| ChatGPTError | The response from the ChatGPT API when an error occurs. |

The ChatGPT type is as follows:

type ChatGPT = {
  choices?:
    | {
        message: {
          role: string;
          content: string;
        };
        finish_reason: string;
        index: number;
      }[]
    | undefined;
  object: string;
  id: string;
  created: number;
  model: string;
  usage: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
};

Next, the ChatGPTError type is as follows:

type ChatGPTError = {
  error: {
    message: string;
    type: string;
    param: string | null;
    code: string | null;
  };
};
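
Since the resolved value can take either shape, one way to handle it is to check for the error property before reading choices. This is a sketch, not part of the library's API; it assumes the returned value is the union of the two types above:

import { prompt } from "ts-chatgpt";

const response = await prompt({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Hello!" }],
  options: { temperature: 0.1 },
});

// Distinguish the two shapes by checking for the error property.
if ("error" in response) {
  console.error(`ChatGPT API error (${response.error.type}): ${response.error.message}`);
} else {
  // choices may be undefined, so guard before reading the first message.
  const content = response.choices?.[0]?.message.content;
  console.log(content ?? "No choices were returned.");
}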

Team

Keisuke Takagi

License

This project is licensed under the terms of the MIT license.
