@aibrow/node-llama-cpp
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level (a usage sketch follows this listing)
Anthropic provider for Agenite
AWS Bedrock provider for Agenite
An MCP implementation for Agenite
Ollama provider for Agenite
OpenAI provider for Agenite
Middleware to pretty-print logs for Agenite
Tool interface for Agenite
Context-aware tool retrieval for language models - unlock the full potential of LLM function calling without context window constraints.
OpenServ Agent SDK - Create AI agents easily
Transform TypeScript classes into JSON Schema definitions with automatic support for OpenAI, Anthropic, and Google Gemini function calling (tool) formats
A scalable tool and function orchestration SDK for LLM agents using OpenAI's /v1/responses API, with memory, hooks, and tool-planning support. An alternative to MCP for LLM tool orchestration.
TypeScript client for humanlayer.dev
Google AI integration for Robota SDK - Gemini Pro, Gemini Flash, function calling, and tool integration with Google's Generative AI
Fizz - the function wizard for LLMs.
AI-powered Git automation with intelligent commit decisions using Gemini function calling, smart diff optimization, push control, and an enhanced interactive terminal session with persistent command history
MCP server for Grok AI API integration
An orchestration framework for Large Language Models (LLMs) with TypeScript support
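The @aibrow/node-llama-cpp entry above describes enforcing a JSON schema on the output at the generation level. Below is a minimal sketch of that workflow, assuming the package mirrors the upstream node-llama-cpp v3 API (getLlama, createGrammarForJsonSchema, LlamaChatSession); the model path is a placeholder.

```ts
import {getLlama, LlamaChatSession} from "@aibrow/node-llama-cpp";

// Load the runtime and a local GGUF model (the path below is a placeholder).
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "./models/llama-3.1-8b-instruct.Q4_K_M.gguf"
});
const context = await model.createContext();
const session = new LlamaChatSession({contextSequence: context.getSequence()});

// Build a grammar from a JSON schema so sampling can only emit conforming tokens.
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        title: {type: "string"},
        rating: {type: "number"}
    },
    required: ["title", "rating"]
} as const);

// Generation is constrained by the grammar, so the response is guaranteed to parse.
const answer = await session.prompt("Suggest a movie and rate it from 1 to 10.", {grammar});
const result = grammar.parse(answer); // object matching the schema
console.log(result.title, result.rating);
```

Because the constraint is applied at sampling time rather than by post-hoc validation, malformed JSON never has to be retried; the trade-off is that the schema must be known before generation starts.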