@schornio/ollama-ai-provider-fork
Vercel AI Provider for running LLMs locally using Ollama
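A minimal usage sketch, assuming the fork keeps the same exports as the upstream ollama-ai-provider package (a `createOllama` factory used with the Vercel AI SDK's `generateText`) and that a local Ollama server is running; the `llama3` model name and base URL are illustrative assumptions:

```ts
import { createOllama } from '@schornio/ollama-ai-provider-fork';
import { generateText } from 'ai';

// Point the provider at a locally running Ollama server
// (http://localhost:11434/api is Ollama's default API endpoint).
const ollama = createOllama({
  baseURL: 'http://localhost:11434/api',
});

async function main() {
  // Model name is an assumption: any model pulled via `ollama pull` should work.
  const { text } = await generateText({
    model: ollama('llama3'),
    prompt: 'Summarize what a local LLM provider does in one sentence.',
  });
  console.log(text);
}

main();
```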
Vercel AI Provider that supplies memory to LLMs
CLI for Empire UI components - AI-ready React components