A comprehensive, asynchronous client for the Model Context Protocol (MCP). It bridges the gap between powerful AI models like Anthropic's Claude and a universe of external tools, local/remote servers, and contextual data sources, enabling complex, stateful interactions.
Built by Jeffrey Emanuel
The Model Context Protocol (MCP) standardizes how AI models interact with external capabilities (tools, resources, prompts). This client aims to be the ultimate interface for leveraging MCP, providing multi-transport connectivity (`stdio`, `sse`) with built-in resilience, advanced error handling, and intelligent discovery.

This project tackles significant engineering challenges, especially around reliable `stdio` communication, asynchronous state management, and providing a consistent experience across both CLI and Web interfaces, to deliver a truly comprehensive MCP client solution.
Key features:

- Interactive CLI (`/commands`, autocompletion, Rich Markdown rendering) and batch-mode operation via Typer. Includes a live Textual User Interface (TUI) dashboard for monitoring.
- Connects to `stdio`, `sse` (HTTP Server-Sent Events), and `streaming-http` MCP servers.
- A custom `RobustStdioSession` designed to gracefully handle potentially noisy or non-compliant `stdio` servers, including management of the underlying `stdio` server processes.
- Stdout protection (`StdioProtectionWrapper`, the `safe_stdout` context manager, the `get_safe_console()` function) prevents accidental writes to `sys.stdout` from corrupting the `stdio` channel, ensuring safe coexistence of multiple `stdio` servers and user output (redirected to `stderr` when necessary). This was crucial for stability.
- Automatic retries with circuit breakers (`@retry_with_circuit_breaker`). Background health monitoring (`ServerMonitor`) checks server responsiveness.
- Support for the `streaming-http` transport protocol via the FastMCP library. This transport provides efficient bidirectional communication over HTTP with built-in streaming capabilities, making it ideal for modern MCP server deployments.
- Transport auto-detection: `stdio` for local script servers; URLs with `/sse` or `/events` paths → `sse`; otherwise `streaming-http` (default for modern servers).
- Finds local `stdio` servers (Python/JS scripts) in configured filesystem paths.
- mDNS/Zeroconf discovery of servers on the local network (`_mcp._tcp.local.`). Interactive commands (`/discover list`, `/discover connect`, etc.) for managing LAN servers.
- Active port scanning that probes ports with an MCP `initialize` handshake. Supports detection of all transport types (`stdio`, `sse`, `streaming-http`). Configurable via the `/config port-scan ...` commands.
- Imports server definitions from `claude_desktop_config.json` in the project root. Intelligently adapts configurations: remaps `wsl.exe ... bash -c "cmd"` calls to direct Linux shell execution (`/bin/bash -c "cmd"`) and converts Windows paths (`C:\...`) in arguments to their Linux/WSL equivalents (`/mnt/c/...`) using `adapt_path_for_platform` for seamless integration.
- Queries Claude via the `anthropic` SDK, supporting multi-turn tool use scenarios.
- Streaming `Rich` rendering (CLI). Handles complex streaming events, including partial JSON input accumulation (`input_json_delta`) for tools requiring structured input.
- Direct tool execution via the `/tool` command (CLI) or a dedicated modal (Web UI).
- Conversation branches (`ConversationGraph`) allow exploring different interaction paths without losing history. Visually represented and navigable in the Web UI.
- Summarization (`/optimize`) of long conversation histories using a specified AI model (configurable) to stay within context limits.
- Apply prompt templates to the conversation via the `/prompt` command.
- Save (`/export`) and load (`/import`) entire conversation branches in a portable JSON format for sharing or backup.
- OpenTelemetry metrics (`opentelemetry-sdk`) and tracing (spans) for monitoring client operations, server requests, and tool execution performance. Console exporters can be enabled for debugging.
- A live TUI dashboard (`/dashboard`) built with `Rich`, showing server health, tool usage stats, and client info.
- Disk (`diskcache`) and in-memory caching for tool results to improve speed and reduce costs. Configurable TTLs per tool category (e.g., `weather`, `filesystem`).
- Smart cache dependencies: invalidating one tool's cached result (e.g., `stock:lookup`) can automatically invalidate dependent caches (e.g., `stock:analyze`). View the graph via `/cache dependencies`.

A glimpse into the Ultimate MCP Client's interfaces:
<br/> <table> <tbody> <tr> <td align="center" valign="top" width="50%"> <img src="https://github.com/Dicklesworthstone/ultimate_mcp_client/blob/main/screenshots/terminal_example_01.webp?raw=true" alt="CLI Interactive Mode showing tool execution and streaming" width="95%"> <br/> <p align="center"><small><em>Interactive CLI: Streaming response with tool call/result.</em></small></p> </td> <td align="center" valign="top" width="50%"> <img src="https://github.com/Dicklesworthstone/ultimate_mcp_client/blob/main/screenshots/terminal_example_02.webp?raw=true" alt="CLI TUI Dashboard showing server status" width="95%"> <br/> <p align="center"><small><em>Live TUI Dashboard: Real-time server & tool monitoring (<code>/dashboard</code>).</em></small></p> </td> </tr> <tr> <td align="center" valign="top" width="33%"> <img src="https://github.com/Dicklesworthstone/ultimate_mcp_client/blob/main/screenshots/webui_example_01.webp?raw=true" alt="Web UI Chat Interface" width="95%"> <br/> <p align="center"><small><em>Web UI: Main chat interface showing messages and tool interactions.</em></small></p> </td> <td align="center" valign="top" width="33%"> <img src="https://github.com/Dicklesworthstone/ultimate_mcp_client/blob/main/screenshots/webui_example_02.webp?raw=true" alt="Web UI Server Management Tab" width="95%"> <br/> <p align="center"><small><em>Web UI: Server management tab with connection status and controls.</em></small></p> </td> <td align="center" valign="top" width="33%"> <img src="https://github.com/Dicklesworthstone/ultimate_mcp_client/blob/main/screenshots/webui_example_03.webp?raw=true" alt="Web UI Conversation Branching View" width="95%"> <br/> <p align="center"><small><em>Web UI: Conversation tab showing the branching graph structure.</em></small></p> </td> </tr> </tbody> </table><br/>Note: Some screenshots feature tools like
`llm_gateway:generate_completion`. These are provided by the LLM Gateway MCP Server, another project by the same author. This server acts as an MCP-native gateway, enabling advanced agents (like Claude, used by this client) to intelligently delegate tasks to various other LLMs (e.g., Gemini, GPT-4o-mini), often optimizing for cost and performance.
Requires Python 3.13+
First, install uv (the recommended fast Python package installer):
```bash
# macOS / Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
```
Then clone the repository, set up a virtual environment using Python 3.13+, and install packages:
```bash
git clone https://github.com/Dicklesworthstone/ultimate_mcp_client
cd ultimate_mcp_client

# Create venv using uv (recommended)
uv venv --python 3.13
# Or using standard venv
# python3.13 -m venv .venv

# Activate environment
source .venv/bin/activate       # Linux/macOS
# .venv\Scripts\activate        # Windows PowerShell
# .\.venv\Scripts\activate.bat  # Windows CMD

# Install dependencies using uv (fastest)
uv sync --all-extras
# Or using pip (slower)
# pip install -e .  # Installs only core dependencies
```
Set your Anthropic API key as an environment variable:
```bash
export ANTHROPIC_API_KEY="sk-ant-..."
# Or add ANTHROPIC_API_KEY="sk-ant-..." to a .env file in the project root
```
Alternatively, set it later using the `/config api-key ...` command in the interactive CLI or via the Web UI settings panel.
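For reference, the environment variable (or a `.env` file) takes precedence over any key stored in the config file. The sketch below is a minimal illustration of that lookup order; the `api_key` field name inside `config.yaml` is an assumption here, not the client's actual schema:

```python
# Illustrative sketch of the documented precedence: ANTHROPIC_API_KEY from the
# environment (or a .env file) overrides any key stored in the YAML config.
import os

import yaml                      # PyYAML (already a project dependency)
from dotenv import load_dotenv   # python-dotenv (already a project dependency)


def resolve_api_key(config_path: str = ".mcpclient_config/config.yaml") -> str | None:
    load_dotenv()  # pulls ANTHROPIC_API_KEY from a .env file, if present
    if key := os.getenv("ANTHROPIC_API_KEY"):
        return key
    try:
        with open(config_path, "r", encoding="utf-8") as f:
            data = yaml.safe_load(f) or {}
        # Hypothetical key name; check the generated config.yaml for the real one.
        return data.get("api_key")
    except FileNotFoundError:
        return None
```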
Launch the Web UI:

```bash
mcpclient run --webui
```

Then open your browser to `http://127.0.0.1:8017` (or the configured host/port). You can customize the host and port:

```bash
mcpclient run --webui --host 0.0.0.0 --port 8080
```
Start the interactive CLI:

```bash
mcpclient run --interactive
```

Run a single query non-interactively:

```bash
mcpclient run --query "What's the weather in New York?"
```

Launch the live TUI dashboard:

```bash
mcpclient run --dashboard
```
Configure port scanning:

```bash
# Enable port scanning (if disabled)
mcpclient config port-scan enable true

# Set the range (example)
mcpclient config port-scan range 8000 8999
```
Export and import conversations:

```bash
# Export current conversation branch to a default filename
mcpclient export

# Export current branch to a specific file
mcpclient export --output my_conversation.json

# Export specific conversation branch by ID (first 8 chars often suffice)
mcpclient export --id 12345678 --output specific_branch.json

# Import a conversation file (creates a new branch under the current node)
mcpclient import-conv my_conversation.json
```
The Web UI (`mcpclient run --webui`) provides a modern, user-friendly interface built with Alpine.js, Tailwind CSS, and DaisyUI:

- Chat view with Markdown rendering (via `Marked.js`), code block syntax highlighting (via `highlight.js`) with copy buttons, and clear display of system messages and status updates.
- A visual, navigable view of the `ConversationGraph`. Users can click to checkout different branches (updating the chat view) or fork the current point to create new branches.

When running with `--webui`, a FastAPI server provides programmatic access:
- `GET /api/status` - Client overview (model, servers, tools, history count, current node)
- `GET /api/config` - Get current (non-sensitive) configuration
- `PUT /api/config` - Update configuration settings (e.g., `{ "temperature": 0.8 }`)
- `GET /api/servers` - List all configured servers with status, health, tools
- `POST /api/servers` - Add a new server configuration (supports stdio/sse/streaming-http transports)
- `DELETE /api/servers/{server_name}` - Remove a server configuration
- `POST /api/servers/{server_name}/connect` - Connect to a specific server
- `POST /api/servers/{server_name}/disconnect` - Disconnect from a specific server
- `PUT /api/servers/{server_name}/enable?enabled={true|false}` - Enable/disable a server
- `GET /api/tools` - List all available tools from connected servers
- `GET /api/resources` - List all available resources
- `GET /api/prompts` - List all available prompts
- `GET /api/conversation` - Get current conversation state (messages, current node ID/name, full node graph)
- `POST /api/conversation/fork` - Create a fork (optionally named) from the current conversation node
- `POST /api/conversation/checkout` - Switch the current context to a different conversation node/branch (by ID)
- `POST /api/conversation/clear` - Clear messages on the current node and switch to root
- `POST /api/conversation/optimize` - Trigger context summarization for the current node (optional model/token args)
- `POST /api/tool/execute` - Execute a specific tool with given parameters (requires the `ToolExecuteRequest` model)
- `WS /ws/chat` - WebSocket endpoint for streaming chat (`query`, `command`), receiving updates (`text_chunk`, `status`, `query_complete`, `error`, `state_update`)
(Note: The actual request/response models like `ServerAddRequest` and `ToolExecuteRequest` are defined in the Python code and used by FastAPI for validation.)
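For example, the REST endpoints can be driven from any HTTP client. The sketch below uses `httpx` (already a project dependency); the payload shape for `/api/tool/execute` is an assumption here, and the authoritative schema is the `ToolExecuteRequest` model in the code:

```python
# Hedged sketch: drive the Web UI's API programmatically.
# Assumes the client was started with `mcpclient run --webui` on the default port.
import asyncio

import httpx

BASE_URL = "http://127.0.0.1:8017"


async def main() -> None:
    async with httpx.AsyncClient(base_url=BASE_URL, timeout=30.0) as http:
        # Overview of the running client (model, servers, tools, current node).
        status = (await http.get("/api/status")).json()
        print(status)

        # Execute a tool. The payload keys below are assumptions; check
        # ToolExecuteRequest in mcpclient.py for the authoritative schema.
        resp = await http.post(
            "/api/tool/execute",
            json={"tool_name": "weather:current", "params": {"location": "New York"}},
        )
        resp.raise_for_status()
        print(resp.json())


if __name__ == "__main__":
    asyncio.run(main())
```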
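Similarly, the `/ws/chat` endpoint can be exercised with the `websockets` package. The JSON envelope below (a `type` field plus payload) is an assumption inferred from the event names listed above; consult the WebSocket handler in `mcpclient.py` for the real message schema:

```python
# Hedged sketch: stream a chat query over the /ws/chat WebSocket endpoint.
# The message envelope ({"type": ..., ...}) is assumed, not taken from the code.
import asyncio
import json

import websockets


async def chat(query: str) -> None:
    async with websockets.connect("ws://127.0.0.1:8017/ws/chat") as ws:
        await ws.send(json.dumps({"type": "query", "query": query}))
        while True:
            event = json.loads(await ws.recv())
            if event.get("type") == "text_chunk":
                print(event.get("text", ""), end="", flush=True)
            elif event.get("type") in ("query_complete", "error"):
                break


asyncio.run(chat("What's the weather in New York?"))
```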
Run `mcpclient --help` or `mcpclient [COMMAND] --help` for details.
In interactive mode (`mcpclient run --interactive`), type `/` followed by a command:
```
/help         Show this help message
/exit, /quit  Exit the client
/config       Manage configuration (api-key, model, max-tokens, history-size, auto-discover, discovery-path, port-scan)
/servers      Manage MCP servers (list, add, remove, connect, disconnect, enable, disable, status)
/discover     Discover/manage LAN servers (list, connect <NAME>, refresh, auto <on|off>)
/tools        List available tools (optionally filter by server: /tools <server_name>)
/tool         Directly execute a tool: /tool <tool_name> '{"param": "value"}'
/resources    List available resources (optionally filter by server: /resources <server_name>)
/prompts      List available prompt templates (optionally filter by server: /prompts <server_name>)
/prompt       Apply a prompt template to the current conversation context: /prompt <prompt_name>
/model        View or change the current AI model: /model [<model_name>]
/fork         Create a new branch from the current conversation point: /fork [Optional Branch Name]
/branch       Manage branches (list, checkout <node_id_prefix>)
/export       Export current branch: /export [--id <node_id>] [--output <file.json>]
/import       Import conversation file: /import <file.json>
/history      View recent conversation history (optionally specify number: /history 10)
/cache        Manage tool cache (list, clear [--all|tool_name], clean, dependencies [tool_name])
/dashboard    Show the live Textual User Interface (TUI) dashboard (requires separate run)
/optimize     Summarize current conversation context: /optimize [--model <model>] [--tokens <num>]
/reload       Disconnect, reload capabilities, and reconnect to enabled servers
/clear        Clear messages in the current branch and optionally reset to root
```
This client employs several techniques to provide a robust and feature-rich experience:

- Asynchronous core: built on `asyncio` for efficient handling of network I/O (HTTP, WebSockets, SSE), subprocess communication (`stdio`), filesystem operations (`aiofiles`), and concurrent background tasks (monitoring, discovery, port scanning).
- `MCPClient`: The main application class, orchestrating UI loops, command handling, and core logic.
- `ServerManager`: Handles server configuration, lifecycle (connecting, disconnecting, restarting), discovery mechanisms, capability aggregation (tools, resources, prompts), and manages server processes/sessions. Uses `AsyncExitStack` for reliable resource cleanup. Supports all transport types, including the modern `streaming-http` protocol via FastMCP integration.
- `RobustStdioSession` (key engineering effort): A custom implementation of the `mcp.ClientSession` tailored specifically for `stdio` servers. It includes logic to:
  - Tolerate noisy or non-protocol output on `stdout` to prevent protocol errors.
  - Resolve responses directly via `asyncio.Future` objects associated with request IDs, offering a potentially more performant alternative to queue-based approaches.
  - Manage the server process, reading its `stdout` and writing captured `stderr` to log files asynchronously.
- Stdout protection: multiple layers prevent `stdout` pollution, which is fatal for `stdio`-based protocols:
  - `StdioProtectionWrapper`: Globally wraps `sys.stdout` to intercept writes, redirecting them to `stderr` if any `stdio` server is active.
  - `safe_stdout()`: A context manager used during critical operations (like server connection) to temporarily redirect `stdout` to `stderr`.
  - `get_safe_console()` / `safe_print()`: Utility functions ensuring that UI output (via `Rich`) uses the correct stream (`stdout` or `stderr`) based on active `stdio` servers.
- `ConversationGraph`: Manages the non-linear, branching conversation structure using `ConversationNode` objects. Handles persistence to/from JSON.
- `ToolCache`: Implements caching logic using `diskcache` for persistence and an in-memory layer for speed. Includes TTL management and dependency invalidation.
- `ServerRegistry` / `ServerMonitor`: Handle mDNS/Zeroconf discovery/registration and background server health checks with recovery attempts.
- Web backend: `FastAPI` for a clean REST API structure, `uvicorn` for the ASGI server, and `websockets` for bidirectional real-time chat. A FastAPI `lifespan` context manager ensures the `MCPClient` is properly initialized on startup and cleaned up on shutdown. Dependency injection provides endpoint access to the client instance.
- Web frontend: `Alpine.js` for lightweight reactivity and component logic directly in the HTML. `Tailwind CSS` and `DaisyUI` provide styling and pre-built components. `Marked.js`, `highlight.js`, and `DOMPurify` handle secure and attractive rendering of Markdown and code. `Tippy.js` provides tooltips.
- CLI and terminal output: `Typer` for defining the command-line interface and parsing arguments. `Rich` is heavily used for formatted console output, tables, progress bars, Markdown rendering, syntax highlighting, and the live TUI dashboard. Careful management (`_run_with_progress`, `_run_with_simple_progress` helpers) prevents issues with nested Rich `Live` displays during complex operations.
- Error handling: decorators (`@retry_with_circuit_breaker`, `@with_tool_error_handling`) for common patterns like retries and standardized error reporting during tool execution. Structured `try...except...finally` blocks are used throughout for robustness.
- Observability: `OpenTelemetry` for structured metrics (request counters, latency histograms, tool executions) and distributed tracing (spans track operations like query processing, server connections, tool calls). This aids in performance analysis and debugging.
- Configuration: a `config.yaml` file, environment variables (especially for sensitive keys like `ANTHROPIC_API_KEY`), and interactive commands or Web UI settings that persist changes back to the YAML file.

The Smart Cache Dependency system allows tools to declare dependencies on other tools; the registered graph can be viewed with `/cache dependencies`.
Example dependency flow:

`weather:current` → `weather:forecast` → `travel:recommendations`

If the current weather data is updated, both the forecast and travel recommendations caches are automatically invalidated.
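To make the idea concrete, here is a minimal, self-contained sketch of dependency-aware invalidation. It is illustrative only and is not the project's `ToolCache`, which layers `diskcache`, an in-memory cache, and TTLs on top of this idea:

```python
# Minimal sketch of dependency-aware cache invalidation (assumes an acyclic graph).
from collections import defaultdict


class DependencyAwareCache:
    def __init__(self) -> None:
        self._entries: dict[str, object] = {}
        # Maps a tool name to the tools whose cached results depend on it.
        self._dependents: dict[str, set[str]] = defaultdict(set)

    def register_dependency(self, tool: str, depends_on: str) -> None:
        self._dependents[depends_on].add(tool)

    def set(self, tool: str, value: object) -> None:
        self._entries[tool] = value

    def invalidate(self, tool: str) -> None:
        """Drop the entry for `tool` and, transitively, every dependent entry."""
        self._entries.pop(tool, None)
        for dependent in self._dependents.get(tool, set()):
            self.invalidate(dependent)


cache = DependencyAwareCache()
cache.register_dependency("weather:forecast", depends_on="weather:current")
cache.register_dependency("travel:recommendations", depends_on="weather:forecast")
for name in ("weather:current", "weather:forecast", "travel:recommendations"):
    cache.set(name, {"cached": True})

cache.invalidate("weather:current")  # also clears forecast and travel recommendations
assert not cache._entries            # everything downstream was invalidated
```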
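The stdout-protection layer described in the architecture notes above can be illustrated the same way. This is a simplified stand-in for `StdioProtectionWrapper`, not the actual implementation:

```python
# Simplified illustration: divert stray sys.stdout writes to stderr while any stdio
# server is active, so ordinary print() calls cannot corrupt the JSON-RPC channel.
import sys

_active_stdio_servers: set[str] = set()


class StdoutGuard:
    def __init__(self, wrapped) -> None:
        self._wrapped = wrapped

    def write(self, data: str) -> int:
        target = sys.stderr if _active_stdio_servers else self._wrapped
        return target.write(data)

    def flush(self) -> None:
        self._wrapped.flush()
        sys.stderr.flush()

    def __getattr__(self, name):
        # Delegate everything else (isatty, encoding, ...) to the real stream.
        return getattr(self._wrapped, name)


sys.stdout = StdoutGuard(sys.stdout)          # install once at startup
_active_stdio_servers.add("example-server")   # while connected, prints go to stderr
print("this lands on stderr, not the stdio protocol channel")
```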
This client offers multiple ways to find and integrate MCP servers:

- Filesystem discovery: scans paths listed in `config.yaml` (under `discovery_paths`) for local `stdio` server scripts (e.g., `.py`, `.js`). Defaults include:
  - `.mcpclient_config/servers` (in project root)
  - `~/mcp-servers`
  - `~/modelcontextprotocol/servers`
- Claude Desktop config import (`claude_desktop_config.json`): If this file exists in the project root, the client automatically imports server definitions from it during setup.
  - It detects `wsl.exe ... <shell> -c "command ..."` patterns, extracts the `<shell>` (e.g., `bash`, `sh`) and the `"command ..."`, and runs the command via the Linux shell directly (e.g., `/bin/bash -c "command ..."`), bypassing `wsl.exe`.
  - For other commands (e.g., `npx ...`), it uses `adapt_path_for_platform` to scan arguments for Windows-style paths (`C:\Users\...`) and converts them to their `/mnt/c/Users/...` equivalents, ensuring compatibility when running the client in WSL/Linux.
- Remote registries: queries the registry URLs listed in `config.yaml` (`registry_urls`) to discover publicly available or shared servers (typically SSE).
- Local network discovery: uses the `zeroconf` library to listen for services advertised under `_mcp._tcp.local.` on the local network. The `/discover` command suite allows managing these discovered servers:
  - `/discover list`: View details of currently visible LAN servers.
  - `/discover connect <NAME>`: Add a discovered server to the configuration file and attempt to connect.
  - `/discover refresh`: Manually trigger a re-scan of the network.
  - `/discover auto [on|off]`: Toggle continuous background mDNS scanning (requires `enable_local_discovery: true` in config).
- Port scanning: when enabled (`enable_port_scanning: true` in `config.yaml` or via `/config port-scan enable true`), actively scans a range of ports on specified local IP addresses during startup discovery. It attempts an MCP `initialize` handshake on each port to detect responsive MCP servers. Supports automatic detection of `sse` and `streaming-http` transport types based on server responses and HTTP headers. Configure it via `config.yaml` or the `/config port-scan ...` commands.

Monitoring and debugging:

- Set `use_console_exporter = True` near the top of `mcpclient.py` to enable noisy console output for debugging traces and metrics. For production, configure appropriate OTel exporters (e.g., Jaeger, Prometheus).
- Run `mcpclient run --dashboard` for a live TUI monitoring view powered by `Rich`. Shows server status, health, connection state, basic tool usage stats, and client info. Refreshes periodically.
- Standard `logging` is configured with `RichHandler` for pretty console output (written to `stderr` to avoid `stdio` conflicts).
- Use the `--verbose` or `-v` flag to increase the log level to DEBUG for detailed internal information.
- Enable verbose `stdio` session logging (`USE_VERBOSE_SESSION_LOGGING = True`) to see raw JSON-RPC messages (useful for debugging MCP servers).
- Set `MCP_CLIENT_DEBUG=1` before running to make the CLI print full Python tracebacks for unexpected errors, aiding in debugging client-side issues.
- `stderr` output from `stdio` servers is captured asynchronously and written to log files in the configuration directory (e.g., `.mcpclient_config/<server_name>_stderr.log`), which is crucial for diagnosing server-side problems.

Configuration:

- Settings are stored in `.mcpclient_config/config.yaml`, located in the project's root directory.
- Environment variables: `ANTHROPIC_API_KEY` is the primary way to provide the API key and overrides any key stored in the config file. `EDITOR` is used by `/config edit`. `MCP_CLIENT_DEBUG=1` enables tracebacks.
- The `/config` command allows viewing and modifying settings like `api-key`, `model`, `max-tokens`, `discovery-path`, port scanning parameters, etc. Changes are saved back to `config.yaml`.
- You can also edit `config.yaml` directly.

View Current Config:
```bash
mcpclient config --show
# OR in interactive mode:
/config
```
Edit Config File Manually:
```bash
mcpclient config --edit
# (This will open .mcpclient_config/config.yaml in the editor specified by your $EDITOR environment variable)
```
Core dependencies:

- Async runtime: `asyncio`
- CLI & terminal UI: `Typer`, `Rich`
- Web UI & API: `FastAPI`, `Uvicorn`, `WebSockets`, `Alpine.js`, `Tailwind CSS`, `DaisyUI`
- MCP: the `mcp` SDK (`mcp>=1.0.0`) and `fastmcp` (for streaming-http transport)
- LLM API: the `anthropic` SDK (`anthropic>=0.15.0`)
- Observability: `opentelemetry-sdk`, `opentelemetry-api`, `opentelemetry-instrumentation`
- Utilities: `httpx`, `PyYAML`, `python-dotenv`, `psutil`, `aiofiles`, `diskcache`, `tiktoken`, `zeroconf`, `colorama`

Development tooling:

- Linting/formatting: `ruff` is configured in `pyproject.toml`. Use `uv run lint` or `ruff check . && ruff format .`.
- Type checking: `mypy` is configured in `pyproject.toml`. Use `uv run typecheck` or `mypy mcpclient.py`.

The project is primarily structured within `mcpclient.py` for easier distribution and introspection, although internal class-based modularity is maintained. The Web UI is served from the self-contained `mcp_client_ui.html` file, utilizing CDN-hosted libraries for simplicity. Key complex logic, such as the robust `stdio` handling and asynchronous management, resides within dedicated classes like `RobustStdioSession` and `ServerManager`.
MIT License. Refer to standard MIT terms.