# ⚡️ mcpo: Instant OpenAPI Proxy for MCP Servers
Expose any Model Context Protocol (MCP) tool as an OpenAPI-compatible HTTP server—instantly.
mcpo is a lightweight proxy that transforms your MCP server commands into standard RESTful OpenAPI endpoints. This allows your tools to seamlessly integrate with LLM agents and applications that expect OpenAPI servers.
**No custom protocol. No glue code. No hassle.**
## 🤔 Why Use mcpo Instead of Native MCP?
MCP servers typically communicate over raw stdio, which presents several challenges:
- **🔓 Security Risks:** Inherently insecure due to lack of authentication and encryption.
- **❌ Incompatibility:** Difficult to integrate with most modern tools and platforms.
- **🧩 Missing Features:** Lacks standard features like documentation, authentication, and robust error handling.
mcpo addresses these issues effortlessly:
- **✅ OpenAPI Compatibility:** Works instantly with OpenAPI tools, SDKs, and UIs.
- **🛡 Enhanced Security:** Leverages trusted web standards for security, stability, and scalability.
- **🧠 Auto-Generated Documentation:** Automatically generates interactive OpenAPI documentation for each tool, eliminating manual configuration. Access it at the `/docs` endpoint.
- **🔌 Pure HTTP:** Uses standard HTTP protocol—no sockets, no custom code, no surprises.
While it might seem like an extra step, mcpo simplifies integration and delivers better outcomes.
**mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.**
## 🚀 Quick Usage
We recommend using [uv](https://github.com/astral-sh/uv) for lightning-fast startup and zero configuration.
```bash
uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
```

Alternatively, if you're using Python:

```bash
pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
```
To use an MCP server that supports Server-Sent Events (SSE), specify the server type and endpoint:

```bash
mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse
```
You can also provide custom headers for the SSE connection:

```bash
mcpo --port 8000 --api-key "top-secret" --server-type "sse" --headers '{"Authorization": "Bearer token", "X-Custom-Header": "value"}' -- http://127.0.0.1:8001/sse
```
To use an MCP server that supports Streamable HTTP, specify the server type and endpoint:

```bash
mcpo --port 8000 --api-key "top-secret" --server-type "streamable_http" -- http://127.0.0.1:8002/mcp
```
You can also run mcpo via Docker without any local installation:

```bash
docker run -p 8000:8000 ghcr.io/open-webui/mcpo:main --api-key "top-secret" -- your_mcp_server_command
```

For example, to proxy the `mcp-server-time` MCP server with uvx:

```bash
uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York
```
Your MCP tool is now accessible at http://localhost:8000 with a generated OpenAPI schema. Test it live at http://localhost:8000/docs.
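
As a quick smoke test, you can call one of the generated endpoints directly. This is a minimal sketch assuming the `mcp-server-time` example above; the `get_current_time` tool and its `timezone` parameter come from that server and will differ if you proxy something else, and the API key is sent as a Bearer token:

```bash
# Call the generated endpoint for the get_current_time tool (tool and parameter names assumed)
curl -X POST http://localhost:8000/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```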
🤝 To integrate with Open WebUI after launching the server, check our docs.
## 🔄 Using a Config File

For serving multiple MCP tools, you can use a configuration file that follows the Claude Desktop format.

Start mcpo with the config file:

```bash
mcpo --config /path/to/config.json
```

Example `config.json`:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=America/New_York"]
    },
    "mcp_sse": {
      "type": "sse",
      "url": "http://127.0.0.1:8001/sse",
      "headers": {
        "Authorization": "Bearer token",
        "X-Custom-Header": "value"
      }
    },
    "mcp_streamable_http": {
      "type": "streamable_http",
      "url": "http://127.0.0.1:8002/mcp"
    }
  }
}
```
Each tool will be accessible under its own unique route:

- http://localhost:8000/memory
- http://localhost:8000/time
Each route has a dedicated OpenAPI schema and proxy handler. Access the full schema UI at `http://localhost:8000/<tool>/docs` (e.g., `/memory/docs`, `/time/docs`).
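
With the config above, a tool call goes through its server's route. A sketch, again borrowing the `get_current_time` tool and `timezone` parameter from `mcp-server-time` (assumptions if your config differs); add an `Authorization: Bearer` header if you started mcpo with `--api-key`:

```bash
# Call the time server's tool under its dedicated route (tool and parameter names assumed)
curl -X POST http://localhost:8000/time/get_current_time \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```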
## 🛠️ Development & Testing

To contribute or run tests locally:

```bash
# Clone the repository
git clone https://github.com/open-webui/mcpo.git
cd mcpo

# Install dependencies (including dev dependencies)
uv sync --dev

# Run the test suite
uv run pytest
```
You can also run mcpo with your local modifications from a specific branch (e.g., `my-feature-branch`):

```bash
# Ensure you are on your development branch
git checkout my-feature-branch

# Make your code changes in the src/mcpo directory or elsewhere

# Run mcpo using uv, which will use your local, modified code
# This command starts mcpo on port 8000 and proxies your_mcp_server_command
uv run mcpo --port 8000 -- your_mcp_server_command

# Example with a test MCP server (like mcp-server-time):
# uv run mcpo --port 8000 -- uvx mcp-server-time --local-timezone=America/New_York
```

This allows you to test your changes interactively before committing or creating a pull request. Access your locally running mcpo instance at http://localhost:8000 and the auto-generated docs at http://localhost:8000/docs.
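
A quick way to confirm the local instance is serving the generated schema (a sketch; `/openapi.json` is the conventional path behind the `/docs` UI and is an assumption here):

```bash
# Fetch the auto-generated OpenAPI schema from the local dev instance
curl http://localhost:8000/openapi.json
```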
## 🪪 License

MIT

## 🤝 Contributing
We welcome and strongly encourage contributions from the community!
Whether you're fixing a bug, adding features, improving documentation, or just sharing ideas—your input is incredibly valuable and helps make mcpo better for everyone.
Getting started is easy: fork the repository, make your changes, and open a pull request.
Not sure where to start? Feel free to open an issue or ask a question—we’re happy to help you find a good first task.
✨ Let's build the future of interoperable AI tooling together!