litellm v0.12.0

License: ISC
Repository: github.com/zya/litellmjs
Last release: 4 months ago

Usage

npm install litellm
import { completion } from 'litellm';
process.env['OPENAI_API_KEY'] = 'your-openai-key';

const response = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
});

// or stream the results
const stream = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
  stream: true,
});

for await (const part of stream) {
  process.stdout.write(part.choices[0]?.delta?.content || '');
}
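The for await loop above consumes an async iterable of chunks. The self-contained sketch below shows the same accumulation pattern against a mock stream; the chunk shape is assumed from the example above, and makeMockStream is purely illustrative (a real stream comes from completion with stream: true):

```typescript
// Chunk shape assumed from the streaming example above (OpenAI-style deltas).
interface StreamChunk {
  choices: { delta?: { content?: string } }[];
}

// Purely illustrative mock; a real stream comes from completion({ stream: true }).
async function* makeMockStream(parts: string[]): AsyncGenerator<StreamChunk> {
  for (const content of parts) {
    yield { choices: [{ delta: { content } }] };
  }
}

// Accumulate the streamed deltas into one string.
async function collect(stream: AsyncIterable<StreamChunk>): Promise<string> {
  let text = '';
  for await (const part of stream) {
    text += part.choices[0]?.delta?.content ?? '';
  }
  return text;
}

collect(makeMockStream(['Hello', ', ', 'world!'])).then((t) => console.log(t)); // prints "Hello, world!"
```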

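For the non-streaming call, the response appears to follow the OpenAI-style shape, with the reply text at choices[0].message.content (an assumption based on the snippets above, not documented here). This sketch uses a hypothetical extractText helper and a hand-built sample object in place of a real API response:

```typescript
// Response shape assumed to be OpenAI-style (an assumption, not part of the
// litellm docs above): choices[0].message.content holds the reply text.
interface CompletionResponse {
  choices: { message: { role: string; content: string } }[];
}

// Hypothetical helper; not part of the litellm package.
function extractText(response: CompletionResponse): string {
  return response.choices[0]?.message?.content ?? '';
}

// Hand-built sample standing in for a real completion() result.
const sample: CompletionResponse = {
  choices: [{ message: { role: 'assistant', content: "I'm doing well, thanks!" } }],
};

console.log(extractText(sample)); // prints "I'm doing well, thanks!"
```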
Features

We aim to support all the features that the LiteLLM Python package supports.

  • Standardised completions
  • Standardised embeddings
  • Standardised input params 🚧
  • Caching ❌
  • Proxy ❌

Supported Providers

Provider (Completion / Streaming / Embedding)
openai
cohere
anthropic
ollama
ai21
replicate
deepinfra
mistral
huggingface
together_ai
openrouter
vertex_ai
palm
baseten
azure
sagemaker
bedrock
vllm
nlp_cloud
aleph alpha
petals

Development

Clone the repo

git clone https://github.com/zya/litellmjs.git

Install dependencies

npm install

Run unit tests

npm t

Run E2E tests

First copy the example env file.

cp .example.env .env

Then fill in the variables with your API keys to run the E2E tests.

OPENAI_API_KEY=<Your OpenAI API key>
....

Then run the E2E tests with the command below:

npm run test:e2e