ollama-ai-provider-fix v0.9.2
Ollama Provider for the Vercel AI SDK
The Ollama Provider for the Vercel AI SDK contains language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.
Setup
The Ollama provider is available in the ollama-ai-provider module. You can install it with:
npm i ollama-ai-provider
Provider Instance
You can import the default provider instance ollama from ollama-ai-provider:
import { ollama } from 'ollama-ai-provider';
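If you need a customized setup, for example pointing at a non-default Ollama server, the provider also exports a createOllama factory. The following is a minimal sketch, assuming the createOllama function and its baseURL option are available in the version you have installed:
import { createOllama } from 'ollama-ai-provider';

// Custom provider instance; the baseURL below assumes a local Ollama server
// listening on the default port.
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
});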
Example
import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';
const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
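The provider also supports the Ollama embeddings API, as noted above. As a hedged sketch, assuming the provider exposes an ollama.embedding() model factory and using the embed helper from the ai package, generating an embedding could look like this:
import { ollama } from 'ollama-ai-provider';
import { embed } from 'ai';

// Assumes an embedding model such as 'nomic-embed-text' has been pulled locally.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
});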
Documentation
Please check out the Ollama provider documentation for more information.