🦀 Prevents outdated Rust code suggestions from AI assistants. This MCP server fetches current crate docs, uses embeddings/LLMs, and provides accurate context via a tool call.
⭐ Like this project? Please star the repository on GitHub to show your support and stay updated! ⭐
Modern AI-powered coding assistants (like Cursor, Cline, Roo Code, etc.) excel
at understanding code structure and syntax but often struggle with the specifics
of rapidly evolving libraries and frameworks, especially in ecosystems like Rust
where crates are updated frequently. Their training data cutoff means they may
lack knowledge of the latest APIs, leading to incorrect or outdated code
suggestions.
This MCP server addresses this challenge by providing a focused, up-to-date knowledge source for a specific Rust crate. By running an instance of this server for a crate (e.g., `serde`, `tokio`, `reqwest`), you give your LLM coding assistant a tool (`query_rust_docs`) it can use before writing code related to that crate.
When instructed to use this tool, the LLM can ask specific questions about the
crate's API or usage and receive answers derived directly from the current
documentation. This significantly improves the accuracy and relevance of the
generated code, reducing the need for manual correction and speeding up
development.
Multiple instances of this server can be run concurrently, allowing the LLM
assistant to access documentation for several different crates during a coding
session.
This server fetches the documentation for a specified Rust crate, generates
embeddings for the content, and provides an MCP tool to answer questions about
the crate based on the documentation context.
- Uses OpenAI's `text-embedding-3-small` model to find the documentation sections most relevant to a question, and the `gpt-4o-mini-2024-07-18` model to generate an answer from that context.
- Caches the generated embeddings on disk (in `~/.local/share/rustdocs-mcp-server/` or similar) so later startups are fast.
- Requires an OpenAI API key, provided via the `OPENAI_API_KEY` environment variable.
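To illustrate the embedding-based lookup, here is a minimal sketch of relevance ranking by cosine similarity. The actual server is written in Rust and uses real OpenAI embeddings; the chunk names and three-dimensional vectors below are made up for demonstration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Toy stand-ins for the (doc chunk, embedding) pairs produced at startup.
doc_chunks = [
    ("serde::Serialize trait docs", [0.9, 0.1, 0.0]),
    ("serde_json::to_string docs",  [0.7, 0.6, 0.1]),
    ("serde derive macro docs",     [0.1, 0.9, 0.3]),
]

def top_chunks(question_embedding, chunks, k=2):
    """Return the k chunks most similar to the question embedding."""
    ranked = sorted(chunks,
                    key=lambda c: cosine_similarity(question_embedding, c[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A question embedding closest to the first two chunks.
print(top_chunks([0.8, 0.4, 0.0], doc_chunks))
# → ['serde_json::to_string docs', 'serde::Serialize trait docs']
```

The selected chunks become the context passed to the answer-generation model.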
The recommended way to install is to download the pre-compiled binary for your operating system from the GitHub Releases page.

1. Download the archive for your platform (`.zip` for Windows, `.tar.gz` for Linux/macOS).
2. Extract the `rustdocs_mcp_server` (or `rustdocs_mcp_server.exe`) binary.
3. Place it in a directory included in your `PATH` environment variable (e.g., `/usr/local/bin`, `~/bin`).

If you prefer to build from source, you will need the Rust Toolchain installed.
```bash
git clone https://github.com/Govcraft/rust-docs-mcp-server.git
cd rust-docs-mcp-server
cargo build --release
```
Important Note for New Crates:
When using the server with a crate for the first time (or with a new version/feature set), it needs to download the documentation and generate embeddings. This process can take some time, especially for crates with extensive documentation, and requires an active internet connection and OpenAI API key.
It is recommended to run the server once directly from your command line for any new crate configuration before adding it to your AI coding assistant (like Roo Code, Cursor, etc.). This allows the initial embedding generation and caching to complete. Once you see the server startup messages indicating it's ready (e.g., "MCP Server listening on stdio"), you can shut it down (Ctrl+C). Subsequent launches, including those initiated by your coding assistant, will use the cached data and start much faster.
The server is launched from the command line and requires the Package ID Specification for the target crate. This specification follows the format used by Cargo (e.g., `crate_name`, `crate_name@version_req`). For the full specification details, see `man cargo-pkgid` or the Cargo documentation.

Optionally, you can specify required crate features using the `-F` or `--features` flag, followed by a comma-separated list of features. This is necessary for crates that require specific features to be enabled for `cargo doc` to succeed (e.g., crates like `async-stripe`, which requires a runtime feature).
```bash
# Set the API key (replace with your actual key)
export OPENAI_API_KEY="sk-..."

# Example: Run server for the latest 1.x version of serde
rustdocs_mcp_server "serde@^1.0"

# Example: Run server for a specific version of reqwest
rustdocs_mcp_server "reqwest@0.12.0"

# Example: Run server for the latest version of tokio
rustdocs_mcp_server tokio

# Example: Run server for async-stripe, enabling a required runtime feature
rustdocs_mcp_server "async-stripe@0.40" -F runtime-tokio-hyper-rustls

# Example: Run server for another crate with multiple features
rustdocs_mcp_server "some-crate@1.2" --features feat1,feat2
```
On the first run for a specific crate version and feature set, the server will:

1. Download the crate and run `cargo doc` (with the specified features) to generate its documentation.
2. Parse the generated HTML and extract the text content.
3. Generate embeddings for the content via the OpenAI API. This can take time and incurs API costs, especially for large crates (e.g., `async-stripe`, with over 5000 documentation pages).
4. Cache the results locally.

Subsequent runs for the same crate version and feature set will load the data from the cache, making startup much faster.
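The HTML-parsing step above can be illustrated in Python. This is a stand-in: the server itself uses Rust's `scraper` crate, and rustdoc places the page's main content in a `<section id="main-content">` element, which is what this sketch targets.

```python
from html.parser import HTMLParser

class MainContentExtractor(HTMLParser):
    """Collect text found inside <section id="main-content">."""
    def __init__(self):
        super().__init__()
        self.depth = 0        # nesting depth inside the target section
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "section" and ("id", "main-content") in attrs:
            self.depth = 1
        elif self.depth and tag == "section":
            self.depth += 1   # track nested sections so we close correctly

    def handle_endtag(self, tag):
        if tag == "section" and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth and data.strip():
            self.chunks.append(data.strip())

# A toy rustdoc-like page: sidebar noise plus the main content section.
html = """
<html><body>
  <nav>sidebar noise</nav>
  <section id="main-content">
    <h1>Struct reqwest::Client</h1>
    <p>An asynchronous HTTP client.</p>
  </section>
</body></html>
"""
parser = MainContentExtractor()
parser.feed(html)
print(" ".join(parser.chunks))
# → Struct reqwest::Client An asynchronous HTTP client.
```

Only the extracted text, not the surrounding navigation markup, is embedded and cached.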
The server communicates using the Model Context Protocol over standard
input/output (stdio). It exposes the following:
Tool: `query_rust_docs` — ask a question about the target crate. Input schema:

```json
{
  "type": "object",
  "properties": {
    "question": {
      "type": "string",
      "description": "The specific question about the crate's API or usage."
    }
  },
  "required": ["question"]
}
```
Responses are prefixed with `From <crate_name> docs:` to indicate their source. Example MCP tool call:

```json
{
  "jsonrpc": "2.0",
  "method": "callTool",
  "params": {
    "tool_name": "query_rust_docs",
    "arguments": {
      "question": "How do I make a simple GET request with reqwest?"
    }
  },
  "id": 1
}
```
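For a rough sense of how a client might build such a request, here is a sketch. The field names follow the example above; serializing each message as a single line is an assumption of this illustration, not a statement about the server's exact stdio framing.

```python
import json

def make_call_tool_request(question, request_id=1):
    """Build a JSON-RPC callTool request for the query_rust_docs tool."""
    return {
        "jsonrpc": "2.0",
        "method": "callTool",
        "params": {
            "tool_name": "query_rust_docs",
            "arguments": {"question": question},
        },
        "id": request_id,
    }

# Serialize to one line, ready to write to the server's stdin.
line = json.dumps(make_call_tool_request(
    "How do I make a simple GET request with reqwest?"))
print(line)
```

In practice the MCP client (Roo Code, Claude Desktop, etc.) handles this exchange for you.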
The server also exposes the crate's documentation content as an MCP resource at the URI `crate://<crate_name>` (e.g., `crate://serde`, `crate://reqwest`), and reports status and errors via `logging/message` notifications.

You can configure MCP clients like Roo Code to run multiple instances of this server, each targeting a different crate. Here's an example snippet for Roo Code's `mcp_settings.json` file, configuring servers for `reqwest` and `async-stripe` (note the added features argument for `async-stripe`):
```json
{
  "mcpServers": {
    "rust-docs-reqwest": {
      "command": "/path/to/your/rustdocs_mcp_server",
      "args": [
        "reqwest@0.12"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE"
      },
      "disabled": false,
      "alwaysAllow": []
    },
    "rust-docs-async-stripe": {
      "command": "rustdocs_mcp_server",
      "args": [
        "async-stripe@0.40",
        "-F",
        "runtime-tokio-hyper-rustls"
      ],
      "env": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY_HERE"
      },
      "disabled": false,
      "alwaysAllow": []
    }
  }
}
```
Note:

- Replace `/path/to/your/rustdocs_mcp_server` with the actual path to the binary on your system.
- Replace `YOUR_OPENAI_API_KEY_HERE` with your actual OpenAI API key.
- The server names (`rust-docs-reqwest`, `rust-docs-async-stripe`) are arbitrary names you choose to identify each instance.

For Claude Desktop users, you can configure the server in the MCP settings. Here's an example configuring servers for `serde` and `async-stripe`:
```json
{
  "mcpServers": {
    "rust-docs-serde": {
      "command": "/path/to/your/rustdocs_mcp_server",
      "args": [
        "serde@^1.0"
      ]
    },
    "rust-docs-async-stripe-rt": {
      "command": "rustdocs_mcp_server",
      "args": [
        "async-stripe@0.40",
        "-F",
        "runtime-tokio-hyper-rustls"
      ]
    }
  }
}
```
Note:

- Ensure `rustdocs_mcp_server` is in your system's `PATH`, or provide the full path to the binary (e.g., `/path/to/your/rustdocs_mcp_server`).
- The server names (`rust-docs-serde`, `rust-docs-async-stripe-rt`) are arbitrary names you choose.
- Set the `OPENAI_API_KEY` environment variable where Claude Desktop can access it.
- Note the `-F` argument for crates like `async-stripe` that require specific features.
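For illustration, a cache path like the one described below might be derived roughly as follows. The exact sanitization rules and hash function are assumptions of this sketch, not the server's actual (Rust) implementation.

```python
import hashlib
from pathlib import Path

def cache_path(crate_name, version_req, features):
    """Derive a per-crate/version/feature-set cache location (illustrative)."""
    # Replace characters that are awkward in directory names (e.g., '^', '*').
    sanitized = "".join(c if c.isalnum() or c in ".-_" else "_"
                        for c in version_req)
    # Hash the sorted feature list so the same combination maps to one dir.
    features_hash = hashlib.sha256(
        ",".join(sorted(features)).encode()).hexdigest()[:16]
    base = Path.home() / ".local" / "share" / "rustdocs-mcp-server"
    return base / crate_name / sanitized / features_hash / "embeddings.bin"

print(cache_path("serde", "^1.0", ["derive"]))
```

Keying the cache on version requirement and feature set ensures that two configurations of the same crate never share stale embeddings.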
Generated embeddings and documentation content are cached on disk at `~/.local/share/rustdocs-mcp-server/<crate_name>/<sanitized_version_req>/<features_hash>/embeddings.bin`, where `sanitized_version_req` is derived from the version requirement and `features_hash` is a hash representing the specific combination of enabled features. The cache is serialized with `bincode`.

In more detail, the server:

1. Parses the command-line arguments with `clap`.
2. Creates a temporary project with the target crate as a dependency in its `Cargo.toml`.
3. Runs `cargo doc` via the `cargo` library API to generate HTML documentation, locating the output within `target/doc` by finding its `index.html`.
4. Uses the `scraper` crate to parse each HTML file and extract text content from the main content area (`<section id="main-content">`).
5. Uses the `async-openai` crate and `tiktoken-rs` to generate embeddings for the extracted content with the `text-embedding-3-small` model.
6. Caches the content and embeddings with `bincode`.
7. Serves the `RustDocsServer` over stdio with the `rmcp` crate.
8. On each question (via the `query_rust_docs` tool), finds the most relevant documentation sections and generates an answer with the `gpt-4o-mini-2024-07-18` model via the OpenAI API.

This project is licensed under the MIT License.
Copyright (c) 2025 Govcraft
If you find this project helpful, consider sponsoring the development!
Govcraft/rust-docs-mcp-server · March 26, 2025 · July 7, 2025 · Rust