@jpbehler/partner
A friendly and digestible way to use the OpenAI Assistants API
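For context, a minimal sketch of the underlying Assistants API calls in the official `openai` Node SDK (this shows the raw SDK, not this wrapper's own interface; the model name and instructions are placeholders):

```ts
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Create an assistant, start a thread with one user message, then run it.
const assistant = await client.beta.assistants.create({
  name: "Helper",
  instructions: "Answer briefly.",
  model: "gpt-4o-mini",
});

const thread = await client.beta.threads.create({
  messages: [{ role: "user", content: "Hello!" }],
});

const run = await client.beta.threads.runs.createAndPoll(thread.id, {
  assistant_id: assistant.id,
});

console.log(run.status);
```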
Gather background information to prompt LLMs
Universal AI Development Platform with external MCP server integration, multi-provider support, and professional CLI. Connect to 65+ MCP servers for filesystem, GitHub, database operations, and more. Build, test, and deploy AI applications with OpenAI, An
AI toolkit extracted from lighthouse with multi-provider support
LLM provider abstraction layer with unified streaming interface
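As an illustration of what such an abstraction layer typically looks like, a hypothetical TypeScript sketch; all names here are invented for illustration and are not this package's actual API:

```ts
// Hypothetical shape of a unified streaming interface across providers.
interface ChatChunk {
  delta: string; // incremental text from the provider
  done: boolean; // true on the final chunk
}

interface LLMProvider {
  name: string;
  // Every provider (OpenAI, Claude, Gemini, ...) exposes the same async iterator.
  stream(prompt: string): AsyncIterable<ChatChunk>;
}

// Caller code stays identical no matter which provider is plugged in.
async function printStream(provider: LLMProvider, prompt: string): Promise<void> {
  for await (const chunk of provider.stream(prompt)) {
    process.stdout.write(chunk.delta);
  }
}
```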
Run codebase reviews across OpenAI, Claude, and Gemini models.
Node.js library for the OpenAI API
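A minimal sketch of using the official `openai` Node.js library (assumes `OPENAI_API_KEY` is set in the environment; the model name is a placeholder):

```ts
import OpenAI from "openai";

const client = new OpenAI(); // picks up OPENAI_API_KEY automatically

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Say hello in one sentence." }],
});

console.log(completion.choices[0].message.content);
```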
Client that can be integrated with a backend built on the OpenAI API
Intelligent CLI toolkit that automates internationalization workflows with AI-powered translations for JavaScript/TypeScript projects
Lets an LLM use browser APIs as tools
Simplify AI integration in web apps with local and offline model support
OpenAI JS SDK
Superfast runtime validators with only one line
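This tagline matches typia's; assuming that is the package in question, a minimal sketch of its one-line validation (typia generates the checker at compile time, so its transformer must be enabled in tsconfig):

```ts
import typia from "typia";

interface IUser {
  id: string;
  age: number;
}

const raw: unknown = JSON.parse('{"id":"u1","age":30}');

// One-line runtime validation: no hand-written schema, the checker is
// generated from the IUser type at compile time.
console.log(typia.is<IUser>(raw)); // true
```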
```
npm install bing-sydney-ai
```
AiryLark's Model Context Protocol (MCP) server, providing a high-accuracy translation API
A Model Context Protocol (MCP) server and OpenAI function calling service for interacting with the CoinGecko Pro API
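To illustrate the OpenAI function-calling side of such a service, a sketch using the official `openai` SDK; the tool name and parameter schema below are invented placeholders, not this package's actual definitions:

```ts
import OpenAI from "openai";

const client = new OpenAI();

// Declare a CoinGecko-style price lookup as a function-calling tool.
const resp = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "What is bitcoin trading at in USD?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_coin_price", // hypothetical tool name
        description: "Fetch the current price of a coin from the CoinGecko Pro API",
        parameters: {
          type: "object",
          properties: {
            coin_id: { type: "string", description: "CoinGecko coin id, e.g. 'bitcoin'" },
            vs_currency: { type: "string", description: "Quote currency, e.g. 'usd'" },
          },
          required: ["coin_id"],
        },
      },
    },
  ],
});

// The model responds with a tool call for the service to execute.
console.log(resp.choices[0].message.tool_calls);
```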
MCP server for fal.ai image generation