@isdk/ai-tool-llm-local
llama.cpp LLM local Provider
llama.cpp LLM Provider - OpenAI Compatible
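The provider's own API is not shown here, but the underlying protocol is the standard OpenAI Chat Completions shape served by a local llama.cpp `llama-server`. Below is a minimal sketch of a raw request against such an endpoint; the URL, port, and model name are assumptions for illustration only, not part of this package.

```ts
// Minimal sketch: talking to a local llama.cpp server through its
// OpenAI-compatible Chat Completions endpoint. The URL, port, and model
// name below are assumptions for illustration, not this package's API.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

async function chat(messages: ChatMessage[]): Promise<string> {
  const res = await fetch('http://127.0.0.1:8080/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'local-model', // llama.cpp servers typically ignore or echo this
      messages,
      temperature: 0.7,
    }),
  });
  if (!res.ok) throw new Error(`llama.cpp server returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

// Usage: assumes something like `llama-server -m model.gguf --port 8080`
// is already running locally.
chat([{ role: 'user', content: 'Say hello in one sentence.' }])
  .then(console.log)
  .catch(console.error);
```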
* Management of various prompts
* Collection of LLM system prompt templates, with enhancement: the corresponding system template can be guessed from the model file name (see the sketch below)
* Support for multiple versions under the same system prompt template file
* Recommendation of
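To make the "guess the template from the file name" idea concrete, here is an illustrative sketch. The pattern table, template names, and function below are hypothetical and are not the package's actual data or API.

```ts
// Illustrative sketch only: guessing which chat/system prompt template a
// GGUF file likely expects from its file name. The pattern table and
// template names are hypothetical, not the package's actual mapping.
const TEMPLATE_PATTERNS: Array<[RegExp, string]> = [
  [/llama-?3/i, 'llama3'],
  [/llama-?2/i, 'llama2'],
  [/mistral|mixtral/i, 'mistral-instruct'],
  [/qwen/i, 'chatml'],
  [/phi-?3/i, 'phi3'],
];

function guessTemplate(modelFileName: string): string | undefined {
  for (const [pattern, template] of TEMPLATE_PATTERNS) {
    if (pattern.test(modelFileName)) return template;
  }
  return undefined; // fall back to a default template or ask the user
}

console.log(guessTemplate('Meta-Llama-3-8B-Instruct.Q4_K_M.gguf')); // "llama3"
console.log(guessTemplate('mixtral-8x7b-instruct-v0.1.Q5_0.gguf')); // "mistral-instruct"
```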
A lightweight MCP proxy server for designbot.deno.dev/chat
A library for creating visualizations of undirected and directed graphs with the help of AI.
Governance layer (Identity, RBAC, Credentials, Audit, Logging, Tracing) for Model Context Protocol (MCP) servers.
A modern, animated chat interface component for React applications with CopilotKit integration
Client for the qBraid AI Chat service.
Node.js client for the official ChatGPT API.
A Model Context Protocol server for integrating with any OpenAI SDK-compatible Chat Completions API
A Model Context Protocol server that runs over stdio and wraps a Streamable HTTP server, so any MCP-capable client can use Streamable HTTP MCP servers
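To make the bridging idea concrete, here is a drastically simplified sketch of a stdio-to-Streamable-HTTP relay: JSON-RPC messages read from stdin are POSTed to a remote MCP endpoint and the responses are echoed to stdout. A real bridge also manages sessions, SSE streaming, and notifications; the endpoint URL below is a placeholder assumption.

```ts
// Drastically simplified sketch of the stdio <-> Streamable HTTP bridge idea.
// Each JSON-RPC message arriving on stdin is forwarded to a remote MCP
// endpoint and the response is written back to stdout. Not the actual
// implementation of this package; the URL is a hypothetical placeholder.
import * as readline from 'node:readline';

const REMOTE_MCP_URL = 'https://example.com/mcp'; // placeholder endpoint

const rl = readline.createInterface({ input: process.stdin });

rl.on('line', async (line) => {
  if (!line.trim()) return;
  const res = await fetch(REMOTE_MCP_URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Accept: 'application/json, text/event-stream',
    },
    body: line, // forward the JSON-RPC message unchanged
  });
  const text = await res.text();
  if (text) process.stdout.write(text + '\n');
});
```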
AI SDK compatible provider for Azure (non-OpenAI models)
qckfx AI Agent SDK
A PDF reader application with AI chat capabilities
Koishi plugin that supports platforms using the OpenAI API format; with the console you can use multiple platform configurations at the same time; supports per-group configuration, is deeply optimized for the console interface, and makes parameters easy to configure
Generic client agent for Tabby AI coding assistant IDE extensions.
MCP server for managing Tempo worklogs in Jira