Auto-generate Model Context Protocol (MCP) servers from any API in seconds.
Point MCPForge at an OpenAPI spec, get a ready-to-run Python MCP server that works with Claude Desktop, Cursor, Zed, and any MCP-compatible AI tool.
The Model Context Protocol lets AI assistants use tools — but writing MCP servers by hand is tedious boilerplate. MCPForge generates them automatically from existing API documentation.
```bash
# Before MCPForge: hand-write 200+ lines of MCP boilerplate

# After MCPForge:
mcpforge generate https://api.github.com/openapi.json
# → github_mcp.py (32 tools, ready to run)
```

Quick start:

```bash
pip install mcpforge

# Generate from an OpenAPI spec URL
mcpforge generate https://petstore.swagger.io/v2/swagger.json

# Run the generated MCP server
python petstore_mcp.py

# Or try a built-in demo
mcpforge demo github
```

That's it. Add the server to Claude Desktop and your AI can now use the Petstore API.
```bash
pip install mcpforge
```

Or from source:

```bash
git clone https://github.com/acunningham-ship-it/mcpforge.git
cd mcpforge
pip install -e .
```

Requirements:
- Python 3.10+
- Ollama (optional — for AI-enhanced descriptions)
```bash
# Generate MCP server from OpenAPI URL
mcpforge generate https://api.example.com/openapi.json

# Generate from local file
mcpforge generate ./my-api-spec.yaml

# Specify output file
mcpforge generate ./spec.yaml --output my_server.py

# Use Ollama to enhance tool descriptions
mcpforge generate ./spec.yaml --enhance

# List available examples
mcpforge examples

# Run a demo MCP server
mcpforge demo github
mcpforge demo weather
mcpforge demo calculator
```

MCPForge parses OpenAPI 3.0 / Swagger 2.0 specs — from URLs, local files, or raw YAML/JSON.
For each endpoint, it extracts:
- Path, method, operationId
- Parameters (path, query, header, body)
- Summary and description
- Authentication requirements (Bearer, API key, Basic)
- Response schema
If Ollama is running locally, MCPForge uses a local model to:
- Rewrite terse technical descriptions into clear, AI-friendly tool descriptions
- Identify the most important parameters
- Generate usage examples
This improves how well AI assistants understand what each tool does.
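Mechanically, this kind of enhancement amounts to one request per tool against Ollama's local HTTP API. A stdlib-only sketch, assuming Ollama's standard non-streaming `/api/generate` endpoint; the prompt wording and function names are illustrative, not MCPForge's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"   # Ollama's default local endpoint

def build_prompt(tool_name: str, raw_description: str) -> str:
    """Ask the local model to rewrite a terse endpoint description."""
    return (
        "Rewrite this API endpoint description as a clear, one-sentence "
        "tool description for an AI assistant.\n"
        f"Tool: {tool_name}\n"
        f"Original: {raw_description}\n"
        "Rewritten:"
    )

def enhance_description(tool_name: str, raw_description: str,
                        model: str = "qwen2.5:7b") -> str:
    """Call Ollama /api/generate (non-streaming) and return the rewritten text."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(tool_name, raw_description),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

Because the call is local, enhancement adds latency but no API cost, which is why it is safe to leave optional.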
Produces a single self-contained Python file using the official MCP Python SDK:

```python
#!/usr/bin/env python3
"""MCP server for GitHub API — auto-generated by MCPForge"""
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("GitHub API")

BASE_URL = "https://api.github.com"
GITHUB_TOKEN = os.environ.get("GITHUB_TOKEN", "")


@mcp.tool()
def list_repos(owner: str, per_page: int = 30) -> dict:
    """List public repositories for a GitHub user."""
    headers = {"Authorization": f"Bearer {GITHUB_TOKEN}"} if GITHUB_TOKEN else {}
    resp = httpx.get(f"{BASE_URL}/users/{owner}/repos",
                     params={"per_page": per_page}, headers=headers)
    resp.raise_for_status()
    return resp.json()


# ... more tools ...

if __name__ == "__main__":
    mcp.run()
```

MCPForge also checks the generated file for syntax errors and verifies the MCP structure before writing it to disk.
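That validation step can be approximated with the standard library's `ast` module. This is an illustrative sketch, not MCPForge's actual validator:

```python
import ast

def validate_server_code(source: str) -> list[str]:
    """Return a list of problems in generated server code (empty list = OK)."""
    try:
        tree = ast.parse(source)   # catches syntax errors before writing to disk
    except SyntaxError as exc:
        return [f"syntax error: {exc}"]

    problems = []
    functions = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
    if not functions:
        problems.append("no tool functions defined")
    if "FastMCP" not in source:
        problems.append("missing FastMCP server instance")
    return problems
```

Parsing with `ast` needs no dependencies installed, so even the import of the MCP SDK in the generated file can be checked for well-formedness without executing anything.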
Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "github": {
      "command": "python",
      "args": ["/path/to/github_mcp.py"],
      "env": {
        "GITHUB_TOKEN": "your-github-token"
      }
    }
  }
}
```

Restart Claude Desktop. Your AI can now use the GitHub API.
MCPForge includes a web interface for generating servers without the CLI:

```bash
mcpforge serve
# Open http://localhost:8000
```

Paste any OpenAPI spec URL, click Generate, and copy the result.
| Example | Description | Tools |
|---|---|---|
| `github` | GitHub public API | `list_repos`, `get_repo`, `search_repos`, `list_issues` |
| `weather` | wttr.in weather (no key needed) | `get_weather`, `get_forecast` |
| `calculator` | Math operations | `add`, `subtract`, `multiply`, `divide`, `calculate` |
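To give a flavor of what the demos contain, the calculator's `calculate` tool might evaluate expressions safely with `ast` instead of `eval()`. This is a sketch of the idea, not the actual `calculator_mcp.py`:

```python
import ast
import operator

# Map AST operator nodes to their arithmetic implementations.
OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def calculate(expression: str) -> float:
    """Evaluate a basic arithmetic expression without using eval()."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported expression: {expression!r}")
    return walk(ast.parse(expression, mode="eval").body)
```

Walking the parsed tree and whitelisting operators keeps arbitrary code (attribute access, function calls) out of reach, which matters when the input comes from an AI assistant.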
```bash
# Run any example immediately
mcpforge demo github
mcpforge demo weather
mcpforge demo calculator
```

MCPForge exposes a REST API for programmatic generation:
```bash
# Start the API server
mcpforge serve

# Generate via API
curl -X POST http://localhost:8000/api/generate \
  -H "Content-Type: application/json" \
  -d '{"spec_url": "https://petstore.swagger.io/v2/swagger.json", "enhance": false}'

# Response: {"server_code": "...", "tools": [...], "api_name": "Petstore"}
```

MCPForge works with zero configuration. Optionally, create `mcpforge.yaml`:
```yaml
ollama_url: "http://localhost:11434"
enhance_model: "qwen2.5:7b"   # Model to use for description enhancement
max_tools: 50                 # Max tools per generated server
output_dir: "./generated"     # Default output directory
```

Project layout:

```
mcpforge/
├── mcpforge/            # Core library
│   ├── parser.py        # OpenAPI 3.0 / Swagger 2.0 parser
│   ├── generator.py     # MCP server code generator
│   ├── enhancer.py      # Ollama description enhancer
│   └── validator.py     # Generated code validator
├── cli/main.py          # Typer CLI (mcpforge command)
├── api/main.py          # FastAPI web API + UI
├── examples/            # Built-in MCP server examples
│   ├── github_mcp.py
│   ├── weather_mcp.py
│   └── calculator_mcp.py
├── tests/               # pytest test suite
└── docs/                # GitHub Pages website
```
| Feature | Status |
|---|---|
| OpenAPI 3.0 | ✅ |
| Swagger 2.0 | ✅ |
| Bearer auth | ✅ |
| API key auth | ✅ |
| Path parameters | ✅ |
| Query parameters | ✅ |
| Request body | ✅ |
| Ollama enhancement | ✅ optional |
| Syntax validation | ✅ |
| Web UI | ✅ |
| Docker | ✅ |
- Model Context Protocol — Official MCP site
- MCP Python SDK — The SDK MCPForge uses
- MCP Servers — Community servers
- Claude Desktop MCP Guide — Connect to Claude
MIT — see LICENSE
Built with Python, FastAPI, and the MCP Python SDK.