
HyperbeeAI TypeScript and JavaScript API Library


This library provides convenient access to the HyperbeeAI REST API from TypeScript or JavaScript. This is a fork of the OpenAI Node.js library, modified to work with the HyperbeeAI API.

Installation

npm install hyperbeeai

Usage

The full API of this library can be found in the api.md file along with code examples. The code below shows how to get started using the chat completions API.

import HyperbeeAI from 'hyperbeeai';

const client = new HyperbeeAI({
  apiKey: process.env['HYPERBEEAI_API_KEY'], // This is the default and can be omitted
});

async function main() {
  const chatCompletion = await client.chat.completions.create({
    messages: [{ role: 'user', content: 'Say this is a test' }],
    model: 'hyperbee-4o',
  });
  console.log(chatCompletion.choices[0]?.message?.content);
}

main();

Streaming responses

We provide support for streaming responses using Server-Sent Events (SSE).

import HyperbeeAI from 'hyperbeeai';

const client = new HyperbeeAI();

async function main() {
  const stream = await client.chat.completions.create({
    model: 'hyperbee-4o',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();

If you need to cancel a stream, you can break from the loop or call stream.controller.abort().
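
For example, here is a sketch of cancelling a stream once enough output has arrived. It assumes the fork keeps the same controller interface as the upstream OpenAI library, where stream.controller is an AbortController for the underlying request:

```typescript
import HyperbeeAI from 'hyperbeeai';

const client = new HyperbeeAI();

async function main() {
  const stream = await client.chat.completions.create({
    model: 'hyperbee-4o',
    messages: [{ role: 'user', content: 'Say this is a test' }],
    stream: true,
  });

  let text = '';
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content || '';
    // Stop reading once we have enough output. Calling abort()
    // also cancels the underlying HTTP request; a plain `break`
    // alone would stop iteration without closing the connection.
    if (text.length > 20) {
      stream.controller.abort();
      break;
    }
  }
}

main();
```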

Chat Completion streaming helpers

This library also provides several conveniences for streaming chat completions, for example:

import HyperbeeAI from 'hyperbeeai';

const hyperbeeai = new HyperbeeAI();

async function main() {
  const stream = await hyperbeeai.beta.chat.completions.stream({
    model: 'hyperbee-4o',
    messages: [{ role: 'user', content: 'Say this is a test' }],
  });

  stream.on('content', (delta, snapshot) => {
    process.stdout.write(delta);
  });

  // or, equivalently:
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }

  const chatCompletion = await stream.finalChatCompletion();
  console.log(chatCompletion); // {id: "…", choices: […], …}
}

main();

See helpers.md for more details.

Additional Features

All other features from the original library are supported, including:

  • Request & Response types
  • File uploads
  • Error handling
  • Retries
  • Timeouts
  • Request IDs
  • Auto-pagination
  • Customizing the fetch client
  • And more

Please refer to the original documentation for more details on these features.
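
As a sketch of the error handling, retry, and timeout behavior inherited from the upstream library (assuming the fork preserves the APIError class and the maxRetries and timeout client options):

```typescript
import HyperbeeAI from 'hyperbeeai';

// Client options assumed to match the upstream OpenAI client.
const client = new HyperbeeAI({
  maxRetries: 2, // retry connection errors and retryable HTTP errors
  timeout: 30 * 1000, // per-request timeout in milliseconds
});

async function main() {
  try {
    const completion = await client.chat.completions.create({
      messages: [{ role: 'user', content: 'Say this is a test' }],
      model: 'hyperbee-4o',
    });
    console.log(completion.choices[0]?.message?.content);
  } catch (err) {
    // API failures surface as APIError with status and headers attached.
    if (err instanceof HyperbeeAI.APIError) {
      console.error(err.status, err.name, err.headers);
    } else {
      throw err;
    }
  }
}

main();
```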

Acknowledgments

This library is a fork of the OpenAI Node.js library. We express our gratitude to OpenAI for creating and maintaining the original library.

License

This project is licensed under Apache-2.0, the same license as the original OpenAI Node.js library. All modifications are made in accordance with the terms of that license.

Contributing

If you would like to contribute to this project, please open an issue or submit a pull request in our repository.