# openai-chat-stream v1.0.13

## Usage
To use the `OpenAIStream` class, first install the package:

```shell
npm install openai-chat-stream
```
Then, import the `OpenAIStream` class and create an instance with an `OpenAIStreamOptions` object:

```typescript
import { OpenAIStream, OpenAIStreamOptions } from "openai-chat-stream/index";

const options: OpenAIStreamOptions = {
  key: "<OpenAI API key>",
  model: "gpt-3.5-turbo",
  systemPrompt: "You're Jarvis! ",
};

const stream = new OpenAIStream(options);
```
Once you have an instance of the `OpenAIStream` class, you can use the `stream` method to send an array of `Message` objects and receive responses asynchronously:

```typescript
const messages: Message[] = [
  { role: "user", content: "Hello!" },
  { role: "assistant", content: "Hi there, how can I help you today?" },
  { role: "user", content: "I'm having trouble with my computer." },
  { role: "assistant", content: "What seems to be the problem?" },
];

// Runs asynchronously; returns Promise<void>
stream.stream(messages, ({ chunk, error, done }) => {
  if (error) {
    console.log(`Error: ${error}`);
  } else if (chunk) {
    console.log(`Chunk: ${chunk}`);
  } else if (done) {
    console.log("Done!");
  }
});
```
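Because the callback fires once per token, a common pattern is to accumulate chunks into the full reply. Below is a minimal sketch of that pattern, assuming the `{ chunk, error, done }` callback shape shown above; the `collect` helper and the mock stream are hypothetical, written here only for illustration:

```typescript
// Event shape as documented: one of chunk/error/done is set per call.
type StreamEvent = { chunk: string | null; error: string | null; done: boolean };
type StreamFn = (cb: (event: StreamEvent) => void) => void;

// Hypothetical helper: wraps the token-by-token callback into a Promise
// that resolves with the full concatenated reply.
function collect(streamFn: StreamFn): Promise<string> {
  return new Promise((resolve, reject) => {
    let full = "";
    streamFn(({ chunk, error, done }) => {
      if (error) reject(new Error(error));
      else if (chunk) full += chunk;
      else if (done) resolve(full);
    });
  });
}

// Mock stream standing in for stream.stream: emits three tokens, then done.
const mockStream: StreamFn = (cb) => {
  for (const token of ["Hel", "lo", "!"]) {
    cb({ chunk: token, error: null, done: false });
  }
  cb({ chunk: null, error: null, done: true });
};

collect(mockStream).then((reply) => console.log(reply)); // logs "Hello!"
```

With the real package, you would pass `(cb) => stream.stream(messages, cb)` instead of the mock.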
The `StreamCallback` function receives an object with three properties: `chunk` (the returned message token string, or `null`), `error` (a string, or `null`), and `done` (a boolean). The `chunk` property carries a token of the response from the OpenAI API, and `done` becomes `true` once all responses have been received. If an error occurs, `error` is a string describing it.
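Based on that description, the callback and message shapes can be modeled roughly as follows. This is a sketch inferred from the prose above, not the package's actual type declarations, which may differ:

```typescript
// Sketch of the documented shapes; assumed, not copied from the package.
type Message = {
  role: "system" | "user" | "assistant";
  content: string;
};

type StreamCallback = (event: {
  chunk: string | null; // next response token, or null
  error: string | null; // error description, or null
  done: boolean;        // true once the stream has finished
}) => void;

// Per the docs, exactly one of the three fields is "active" per invocation.
const log: string[] = [];
const cb: StreamCallback = ({ chunk, error, done }) => {
  if (error) log.push(`error:${error}`);
  else if (chunk) log.push(`chunk:${chunk}`);
  else if (done) log.push("done");
};

cb({ chunk: "Hi", error: null, done: false });
cb({ chunk: null, error: null, done: true });
// log is now ["chunk:Hi", "done"]
```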