openai-chat-stream v1.0.13

License: ISC
Last release: 1 year ago

Usage

To use the OpenAIStream class, first install the package:

npm install openai-chat-stream

Then, import the OpenAIStream class and create an instance with an OpenAIStreamOptions object:

import { OpenAIStream, OpenAIStreamOptions, Message } from "openai-chat-stream/index";

const options: OpenAIStreamOptions = {
  key: "<OpenAI API key>",
  model: "gpt-3.5-turbo",
  systemPrompt: "You're Jarvis! ",
};

const stream = new OpenAIStream(options);
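Hard-coding the API key is fine for an example, but in real code it is safer to load it from the environment. A minimal sketch, assuming the key is stored in an `OPENAI_API_KEY` environment variable (that variable name and the `optionsFromEnv` helper are illustrative, not part of the package):

```typescript
// Hypothetical helper: build the options object with a key taken from an
// environment map (e.g. process.env) instead of the source file.
// The option names mirror the OpenAIStreamOptions example above.
function optionsFromEnv(env: Record<string, string | undefined>) {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    throw new Error("OPENAI_API_KEY is not set");
  }
  return {
    key,
    model: "gpt-3.5-turbo",
    systemPrompt: "You're Jarvis!",
  };
}
```

You would then construct the stream with `new OpenAIStream(optionsFromEnv(process.env))`, and the process fails fast with a clear message if the key is missing.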

Once you have an instance of the OpenAIStream class, you can use the stream method to send an array of Message objects and receive responses asynchronously:

const messages: Message[] = [
  { role: "user", content: "Hello!" },
  { role: "assistant", content: "Hi there, how can I help you today?" },
  { role: "user", content: "I'm having trouble with my computer." },
  { role: "assistant", content: "What seems to be the problem?" },
];
// stream() runs asynchronously and returns Promise<void>;
// results are delivered through the callback.
stream.stream(messages, ({ chunk, error, done }) => {
  if (error) {
    console.log(`Error: ${error}`);
  } else if (chunk) {
    console.log(`Chunk: ${chunk}`);
  } else if (done) {
    console.log("Done!");
  }
});
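Because results arrive through repeated callback invocations, it is often convenient to collect the chunks into a single string behind a Promise. A sketch of that pattern (the `CallbackArgs` type, `ChunkStream` interface, and `streamToText` helper are illustrative names, not exports of the package):

```typescript
// Shape of the object passed to the stream() callback, per the usage above.
type CallbackArgs = { chunk: string | null; error: string | null; done: boolean };

// Minimal interface matching the stream(messages, callback) signature,
// so the helper can be exercised without a live API key.
interface ChunkStream {
  stream(messages: unknown[], cb: (args: CallbackArgs) => void): Promise<void>;
}

// Collect streamed chunks into one string: resolve when done fires,
// reject on the first error.
function streamToText(s: ChunkStream, messages: unknown[]): Promise<string> {
  return new Promise((resolve, reject) => {
    let text = "";
    s.stream(messages, ({ chunk, error, done }) => {
      if (error) reject(new Error(error));
      else if (chunk) text += chunk;
      else if (done) resolve(text);
    });
  });
}
```

With this helper, `const reply = await streamToText(stream, messages);` yields the full assembled response once streaming completes.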

The StreamCallback function receives an object with three properties: chunk (the next response token as a string, or null), error (an error message string, or null), and done (a boolean). Each chunk is a piece of the response streamed from the OpenAI API; done becomes true once the full response has been received; and if something goes wrong, error contains a string describing the failure.
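Under these rules, exactly one of the three fields is meaningful on any given invocation, so the contract could be modelled in TypeScript as a discriminated union. A sketch (the `StreamEvent` and `describeEvent` names are hypothetical, not the package's actual type declarations):

```typescript
// Hypothetical modelling of the callback contract: exactly one of the
// three cases applies on each invocation.
type StreamEvent =
  | { chunk: string; error: null; done: false }  // a response token arrived
  | { chunk: null; error: string; done: false }  // the stream failed
  | { chunk: null; error: null; done: true };    // all tokens received

// Classify an event; checking error first mirrors the callback
// example above, and TypeScript narrows each branch for us.
function describeEvent(e: StreamEvent): string {
  if (e.error !== null) return `error: ${e.error}`;
  if (e.done) return "done";
  return `chunk: ${e.chunk}`;
}
```

Modelling the payload this way makes it impossible to construct a state where, say, both chunk and error are set, which a plain three-field object would allow.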

Version History

1.0.13 (1 year ago)
1.0.12 (1 year ago)
1.0.11 (1 year ago)
1.0.10 (1 year ago)
1.0.9 (1 year ago)
1.0.8 (1 year ago)
1.0.7 (1 year ago)
1.0.6 (1 year ago)
1.0.5 (1 year ago)
1.0.4 (1 year ago)
1.0.3 (1 year ago)
1.0.2 (1 year ago)
1.0.1 (1 year ago)
1.0.0 (1 year ago)