# 🤖 @andrao/llm-client
This package provides a single interface for interacting with LLMs from Anthropic, OpenAI, Together.ai, and, locally, Ollama.
## Primary exports

| Function | Description |
| --- | --- |
| `runChatCompletion` | Interoperable chat completion function |
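A rough sketch of how the primary export might be called is shown below. The README does not document the signature, so the parameter names (`provider`, `model`, `messages`) and the `provider` values are assumptions, not the library's confirmed API.

```ts
import { runChatCompletion } from '@andrao/llm-client';

// Hypothetical call shape — the real parameter names may differ.
// The idea is that the same call works across Anthropic, OpenAI,
// Together.ai, and a local Ollama instance by switching `provider`.
const response = await runChatCompletion({
    provider: 'openai',
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Hello!' }],
});

console.log(response);
```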
## Secondary exports

| Function | Description |
| --- | --- |
| `getAnthropicClient` | Lazy-init an Anthropic SDK client |
| `getOllamaClient` | Lazy-init an Ollama client via the OpenAI SDK |
| `getOpenAIClient` | Lazy-init an OpenAI SDK client |
| `getTogetherClient` | Lazy-init a Together.ai client via the OpenAI SDK |
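The secondary exports follow a lazy-init pattern: each getter constructs its SDK client on first use and reuses the cached instance afterwards. Below is a minimal sketch of what a getter like `getTogetherClient` could look like, assuming an OpenAI-compatible base URL and an API key read from an environment variable; the actual implementation in this package may differ.

```ts
import OpenAI from 'openai';

// Cache the client so repeated calls reuse a single instance (lazy-init).
let togetherClient: OpenAI | undefined;

export function getTogetherClient(): OpenAI {
    if (!togetherClient) {
        togetherClient = new OpenAI({
            // Assumed env var name — adjust to your configuration.
            apiKey: process.env.TOGETHER_API_KEY,
            // Together.ai exposes an OpenAI-compatible endpoint,
            // which is why the OpenAI SDK can be reused here.
            baseURL: 'https://api.together.xyz/v1',
        });
    }
    return togetherClient;
}
```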