HuggingFace MCP Server
A TypeScript-based MCP (Model Context Protocol) server that integrates with HuggingFace's inference endpoints to provide:
- Image generation with custom LoRA support
- Story generation
This server implements the OpenAI API protocol for tools, making it compatible with tool-enabled LLM clients.
Features
- Image Generation Tool: Generate images using the Flux model (Stable Diffusion XL) with optional custom LoRA support
- Story Generation Tool: Generate stories using LLMs hosted on HuggingFace
- MCP Protocol Compatible: Implements the OpenAI-compatible tool protocol
- Multiple Transport Support: Use HTTP or stdio transport for maximum compatibility
- CLI Support: Run as a command-line tool with npx
- Claude & Cursor Integration: Ready-to-use MCP configuration files
Installation & Setup
Publishing to npm
Before others can use this package with npx, you need to publish it to npm:
# Create an npm account if you don't have one
npm adduser
# Login to npm
npm login
# Publish the package
npm publish
When updating the package with new features:
- Update the version in package.json, claude-mcp.json, and src/cli.ts
- Build the project with npm run build
- Publish the new version with npm publish
The prepublishOnly script will ensure the project is built before publishing.
Global Installation
Once published, you can install this package globally:
npm install -g huggingface-mcp-server
Then run it using:
hf-mcp-server --api-key YOUR_HUGGINGFACE_API_KEY
Using npx
Run it directly with npx without installing:
npx huggingface-mcp-server --api-key YOUR_HUGGINGFACE_API_KEY
Development Setup
- Clone this repository
- Install dependencies:
  npm install
- Copy the environment file and configure it:
  cp .env.example .env
  Update the .env file with your HuggingFace API key.
- Build and run the server:
  npm run build
  npm start
  Or run in development mode:
  npm run dev
Claude Desktop, Cursor, and Cline Integration
This project includes MCP configuration files for easy integration with various AI assistants:
For Claude Desktop:
Use the claude-mcp.json file in the Claude Desktop MCP configuration settings.
For Cursor Code Editor:
Use the cursor-mcp.json file in the Cursor MCP settings.
For Cline:
Add the contents of cline-mcp.json to your Cline configuration:
{
"huggingface-mcp": {
"command": "npx",
"args": [
"--yes",
"huggingface-mcp-server@latest",
"--api-key=YOUR_HUGGINGFACE_API_KEY_HERE",
"--port=3000"
],
"disabled": false,
"timeout": 60
}
}
Make sure to replace YOUR_HUGGINGFACE_API_KEY_HERE with your actual API key.
All configurations will start the server and require your HuggingFace API key.
CLI Options
Options:
-p, --port <number> Port to run the HTTP server on (default: "3000")
-k, --api-key <string> HuggingFace API key
-e, --env <path> Path to .env file
-t, --transport <type> Transport type (http or stdio) (default: "http")
-h, --help display help for command
Example using HTTP transport:
npx huggingface-mcp-server --port 4000 --api-key YOUR_API_KEY
Example using stdio transport:
npx huggingface-mcp-server --transport stdio --api-key YOUR_API_KEY
You can also set the transport via environment variables:
TRANSPORT=stdio npx huggingface-mcp-server --api-key YOUR_API_KEY
Communication Protocols
HTTP Transport
When running in HTTP mode, the following endpoints are available:
GET /
Returns a health check message indicating the server is running.
POST /v1/tools
Returns the list of available tools:
- generate_image: Generate an image with optional custom LoRA
- generate_story: Generate a story based on a prompt
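The exact shape of the tools response is not documented above. The sketch below assumes each entry follows the standard OpenAI function-calling schema; the description text and JSON Schema details are illustrative, inferred from the example tool calls later in this README rather than taken from the server code.

```typescript
// Illustrative sketch of one entry in the /v1/tools response,
// assuming the OpenAI function-calling schema.
interface ToolDefinition {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>; // JSON Schema for the tool arguments
  };
}

const generateImageTool: ToolDefinition = {
  type: "function",
  function: {
    name: "generate_image",
    description: "Generate an image with optional custom LoRA",
    parameters: {
      type: "object",
      properties: {
        prompt: { type: "string" },
        lora_name: { type: "string" }, // e.g. "username/space-cats-lora"
      },
      required: ["prompt"],
    },
  },
};
```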
POST /v1/chat/completions
Main endpoint that handles the MCP protocol for tool usage.
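As a rough illustration of the HTTP transport, the sketch below calls all three endpoints from Node 18+ (or any runtime with a global fetch). It assumes the server is running locally on the default port 3000 and that an empty POST is acceptable for /v1/tools; responses are simply logged rather than assumed to have a particular shape.

```typescript
// Minimal HTTP client sketch against a locally running server.
const BASE_URL = "http://localhost:3000";

async function main() {
  // Health check
  const health = await fetch(`${BASE_URL}/`);
  console.log(await health.text());

  // List the available tools
  const tools = await fetch(`${BASE_URL}/v1/tools`, { method: "POST" });
  console.log(await tools.json());

  // Ask for an image via the chat completions endpoint
  const completion = await fetch(`${BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [
        { role: "user", content: "I want to generate an image of a cat in space" },
      ],
    }),
  });
  console.log(await completion.json());
}

main().catch(console.error);
```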
stdio Transport
When running in stdio mode, the server communicates using JSON messages through standard input/output:
Request Tools
{"type": "tools"}Chat Request
{
"type": "chat",
"data": {
"messages": [
{
"role": "user",
"content": "Generate an image of a cat"
}
]
}
}
Exit Request
{"type": "exit"}Example Usage
Example Usage
Example request to generate an image:
{
"messages": [
{
"role": "user",
"content": "I want to generate an image of a cat in space"
},
{
"role": "assistant",
"tool_calls": [
{
"id": "call_123",
"type": "function",
"function": {
"name": "generate_image",
"arguments": "{\"prompt\": \"A cat in space with a space helmet, stars in background\", \"lora_name\": \"username/space-cats-lora\"}"
}
}
]
}
]
}
Example request to generate a story:
{
"messages": [
{
"role": "user",
"content": "Write me a story about a space explorer"
},
{
"role": "assistant",
"tool_calls": [
{
"id": "call_456",
"type": "function",
"function": {
"name": "generate_story",
"arguments": "{\"prompt\": \"A space explorer discovers an ancient alien civilization\"}"
}
}
]
}
]
}
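In both examples the function arguments are delivered as a JSON-encoded string, so a client (or the server's tool dispatcher) has to parse them before invoking the tool. A small sketch, covering only the parameters shown in the examples above (the interface name is illustrative):

```typescript
// The "arguments" field of a tool call is a JSON-encoded string; parse it
// before use. GenerateImageArgs reflects the parameters seen in the examples.
interface GenerateImageArgs {
  prompt: string;
  lora_name?: string; // optional custom LoRA, e.g. "username/space-cats-lora"
}

const toolCall = {
  id: "call_123",
  type: "function",
  function: {
    name: "generate_image",
    arguments:
      '{"prompt": "A cat in space with a space helmet, stars in background", "lora_name": "username/space-cats-lora"}',
  },
};

const args = JSON.parse(toolCall.function.arguments) as GenerateImageArgs;
console.log(args.prompt, args.lora_name);
```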
License
MIT