LangGraph-powered ReAct agent with Model Context Protocol (MCP) integration. A Streamlit web interface for dynamically configuring, deploying, and interacting with AI agents capable of accessing various data sources and APIs through MCP tools.
`LangChain-MCP-Adapters` is a toolkit provided by LangChain AI that enables AI agents to interact with external tools and data sources through the Model Context Protocol (MCP). This project provides a user-friendly interface for deploying ReAct agents that can access various data sources and APIs through MCP tools.
ReAct Agent with MCP tools

The Model Context Protocol (MCP) consists of three main components:

- **MCP Host**: Programs that want to access data through MCP, such as Claude Desktop, IDEs, or LangChain/LangGraph.
- **MCP Client**: A protocol client that maintains a 1:1 connection with a server, acting as an intermediary between the host and the server.
- **MCP Server**: A lightweight program that exposes specific capabilities through the standardized Model Context Protocol, serving as a source of context and tools.
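To give a rough idea of how these pieces fit together in code, here is a minimal sketch that loads MCP tools through `langchain-mcp-adapters` and hands them to a LangGraph ReAct agent. The server script path and model name are placeholders, and the client API has changed between adapter releases (older versions used an async context manager), so treat this as an illustrative sketch rather than the project's exact implementation.

```python
# Sketch: expose an MCP server's tools to a LangGraph ReAct agent.
# Assumes langchain-mcp-adapters >= 0.1, langgraph, and langchain-anthropic are installed.
# "./mcp_server_time.py" and the model name are placeholders.
import asyncio

from langchain_anthropic import ChatAnthropic
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent


async def main():
    # Each entry describes one MCP server; stdio servers are launched as subprocesses.
    client = MultiServerMCPClient(
        {
            "time": {
                "command": "python",
                "args": ["./mcp_server_time.py"],  # placeholder server script
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()  # MCP tools wrapped as LangChain tools

    agent = create_react_agent(ChatAnthropic(model="claude-3-7-sonnet-latest"), tools)
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "What time is it in Seoul right now?"}]}
    )
    print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```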
You can easily run this project using Docker without setting up a local Python environment.
Install Docker Desktop: https://www.docker.com/products/docker-desktop/
Navigate to the `dockers` directory:

```bash
cd dockers
```
Create a `.env` file with your API keys in the project root directory:

```bash
cp .env.example .env
```

Enter your obtained API keys in the `.env` file.
(Note) Not all API keys are required. Only enter the ones you need.
- `ANTHROPIC_API_KEY`: If you enter an Anthropic API key, you can use the "claude-3-7-sonnet-latest", "claude-3-5-sonnet-latest", "claude-3-haiku-latest" models.
- `OPENAI_API_KEY`: If you enter an OpenAI API key, you can use the "gpt-4o", "gpt-4o-mini" models.
- `LANGSMITH_API_KEY`: If you enter a LangSmith API key, you can use LangSmith tracing.

```
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENAI_API_KEY=your_openai_api_key
LANGSMITH_API_KEY=your_langsmith_api_key
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_PROJECT=LangGraph-MCP-Agents
```
When using the login feature, set `USE_LOGIN` to `true` and enter `USER_ID` and `USER_PASSWORD`:

```
USE_LOGIN=true
USER_ID=admin
USER_PASSWORD=admin123
```
If you don't want to use the login feature, set `USE_LOGIN` to `false`:

```
USE_LOGIN=false
```
AMD64/x86_64 Architecture (Intel/AMD Processors)

```bash
# Run container
docker compose -f docker-compose.yaml up -d
```
ARM64 Architecture (Apple Silicon M1/M2/M3/M4)

```bash
# Run container
docker compose -f docker-compose-mac.yaml up -d
```
Clone this repository, then create a virtual environment and install dependencies with uv:

```bash
git clone https://github.com/teddynote-lab/langgraph-mcp-agents.git
cd langgraph-mcp-agents
uv venv
uv pip install -r requirements.txt
source .venv/bin/activate  # For Windows: .venv\Scripts\activate
```
Create a `.env` file with your API keys (copy from `.env.example`):

```bash
cp .env.example .env
```

Enter your obtained API keys in the `.env` file.
(Note) Not all API keys are required. Only enter the ones you need.
- `ANTHROPIC_API_KEY`: If you enter an Anthropic API key, you can use the "claude-3-7-sonnet-latest", "claude-3-5-sonnet-latest", "claude-3-haiku-latest" models.
- `OPENAI_API_KEY`: If you enter an OpenAI API key, you can use the "gpt-4o", "gpt-4o-mini" models.
- `LANGSMITH_API_KEY`: If you enter a LangSmith API key, you can use LangSmith tracing.

```
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENAI_API_KEY=your_openai_api_key
LANGSMITH_API_KEY=your_langsmith_api_key
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT=https://api.smith.langchain.com
LANGSMITH_PROJECT=LangGraph-MCP-Agents
```
When using the login feature, set `USE_LOGIN` to `true` and enter `USER_ID` and `USER_PASSWORD`:

```
USE_LOGIN=true
USER_ID=admin
USER_PASSWORD=admin123
```
If you don't want to use the login feature, set `USE_LOGIN` to `false`:

```
USE_LOGIN=false
```
Run the application:

```bash
streamlit run app.py
```
Visit [Smithery](https://smithery.ai/) to find useful MCP servers.
1. Select the tool you want to use.
2. Click the COPY button in the JSON configuration on the right.
3. Paste the copied JSON string into the `Tool JSON` section.
4. Click the `Add Tool` button to add it to the "Registered Tools List" section.
5. Finally, click the "Apply" button to apply the changes and initialize the agent with the new tools.
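As a rough guide to what gets pasted in step 3, an MCP server configuration in JSON generally has the shape shown below. The server name, command, and package here are hypothetical placeholders, and the exact fields vary by server and transport, so copy the actual JSON provided by Smithery or the server's documentation rather than this sketch.

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "@smithery/cli@latest", "run", "@example/example-mcp-server"],
      "transport": "stdio"
    }
  }
}
```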
<img src="./assets/apply-tool-configuration.png" alt="tool json" style="width: auto; height: auto;">For developers who want to learn more deeply about how MCP and LangGraph integration works, we provide a comprehensive Jupyter notebook tutorial:
This hands-on tutorial covers:
This tutorial provides practical examples with step-by-step explanations that help you understand how to build and integrate MCP tools into LangGraph agents.
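As a small taste of what building such a tool involves, the sketch below defines a minimal custom MCP server with the FastMCP API from the official `mcp` Python SDK. The server name and the `add` tool are illustrative placeholders, not the tutorial's actual examples.

```python
# Minimal custom MCP server sketch using the official `mcp` Python SDK (FastMCP).
# The server name and the `add` tool are placeholders for illustration.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Calculator")


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers and return the result."""
    return a + b


if __name__ == "__main__":
    # Run over stdio so an MCP client (such as the agent in this project)
    # can launch the server as a subprocess and call its tools.
    mcp.run(transport="stdio")
```

A server written this way can be registered with the agent just like any other MCP server, for example through the tool configuration described above.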
MIT License