# @smahtz/ollama-ai-provider

Ollama Provider for the Vercel AI SDK
The Ollama Provider for the Vercel AI SDK adds language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.

It carries on the great work of Sergio Gómez on the original Ollama Provider.
## Requirements

This provider requires Ollama >= 0.9.0.
## Setup

The Ollama provider is available in the `@smahtz/ollama-ai-provider` module. You can install it with:

```bash
npm i @smahtz/ollama-ai-provider
```
## Provider Instance

You can import the default provider instance `ollama` from `@smahtz/ollama-ai-provider`:

```ts
import { ollama } from '@smahtz/ollama-ai-provider'
```
Or you can create your own instance with `createOllama` to override the default settings:

```ts
import { createOllama } from '@smahtz/ollama-ai-provider'
```
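As a minimal sketch of overriding defaults, assuming the settings mirror the upstream Ollama Provider (where `baseURL` points at the Ollama API endpoint):

```ts
import { createOllama } from '@smahtz/ollama-ai-provider'

// Point the provider at a non-default Ollama server.
// The URL below is an example value; adjust it to your setup.
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
})
```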
## Example

```ts
import { ollama } from '@smahtz/ollama-ai-provider'
import { generateText } from 'ai'

const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
})
```
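Since the provider also supports the Ollama embeddings API, a hedged sketch of generating an embedding might look like the following (assuming an `ollama.embedding` factory as in the upstream Ollama Provider; `nomic-embed-text` is an example model you would pull first with `ollama pull nomic-embed-text`):

```ts
import { ollama } from '@smahtz/ollama-ai-provider'
import { embed } from 'ai'

// Embed a single value; `embedding` is a numeric vector.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
})
```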
## Documentation

Please check out the Ollama provider documentation for more information.