Browse and discover Model Context Protocol-compatible clients.
This repository provides a TypeScript implementation of a Model Context Protocol (MCP) client, enabling LLM agents to interact with MCP servers over stdio or HTTP+SSE transports, with support for resources, tools, prompts, and sampling.
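For orientation, here is a minimal sketch of what connecting to an MCP server over stdio can look like with the official TypeScript SDK; the server command and the `echo` tool are placeholders, not details taken from this repository:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn an MCP server as a child process and talk to it over stdio.
// "node server.js" is a placeholder for whatever server you run.
const transport = new StdioClientTransport({
  command: "node",
  args: ["server.js"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Discover what the server exposes.
const { tools } = await client.listTools();
const { resources } = await client.listResources();
console.log(tools.map((t) => t.name), resources.map((r) => r.uri));

// Invoke a tool by name; "echo" and its arguments are hypothetical.
const result = await client.callTool({
  name: "echo",
  arguments: { message: "hello from the client" },
});
console.log(result.content);

await client.close();
```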
mcpx-py is a Python library that enables interaction with various Large Language Models (LLMs) using mcp.run tools, supporting models from PydanticAI, OpenAI, Ollama, and Gemini.
This repository provides an MCP server that integrates Ollama with Claude Desktop, enabling users to list models, get model information, and ask questions to specified models using the Model Context Protocol.
MCPSwiftWrapper is a lightweight wrapper for mcp-swift-sdk, designed to integrate with LLM Swift libraries like SwiftOpenAI and SwiftAnthropic. It simplifies the creation of MCP Clients within macOS applications.
This React-based demo showcases an MCP client interacting with SSE servers, enabling tool calls and text completion, though it's still under development and has limitations regarding tool naming and concurrent calls.
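As a rough sketch of the SSE side, this is how a client could attach to an SSE-based MCP server with the TypeScript SDK; the endpoint URL is a placeholder, and the demo's own wiring may differ:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// Connect to a remote MCP server over HTTP+SSE; the URL is a placeholder.
const transport = new SSEClientTransport(new URL("http://localhost:8080/sse"));

const client = new Client(
  { name: "sse-demo-client", version: "0.1.0" },
  { capabilities: {} }
);

await client.connect(transport);

// List the tools the SSE server advertises; calls would then be made
// one at a time, since the demo notes concurrent calls as a limitation.
const { tools } = await client.listTools();
for (const tool of tools) {
  console.log(`available tool: ${tool.name}`);
}
```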
Desktop4mistral is a Python-based desktop application offering a user-friendly interface for interacting with Mistral AI models, featuring model selection, chat history, command support, and Markdown support.
This repository provides a TypeScript-based Model Context Protocol (MCP) client that integrates with LangChain ReAct Agent, enabling interaction with LLMs like Anthropic, OpenAI, and Groq through MCP servers.
This repository showcases a simple Model Context Protocol (MCP) client using LangChain and TypeScript, demonstrating how to convert MCP server tools into LangChain-compatible tools for use with LLMs like Anthropic's Claude.
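The conversion idea can be sketched roughly as follows, wrapping each MCP tool in a LangChain `DynamicStructuredTool`; the schema handling here is deliberately simplified, and the repository's actual adapter may differ:

```typescript
import { z } from "zod";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { DynamicStructuredTool } from "@langchain/core/tools";

// Wrap every tool advertised by an already-connected MCP client as a
// LangChain structured tool so an agent (e.g. a ReAct agent) can call it.
// MCP describes tool inputs with JSON Schema, so a real adapter would
// translate that into a precise zod schema instead of the permissive
// passthrough object used below.
export async function mcpToolsToLangChain(client: Client) {
  const { tools } = await client.listTools();
  return tools.map(
    (t) =>
      new DynamicStructuredTool({
        name: t.name,
        description: t.description ?? "",
        schema: z.object({}).passthrough(), // accept any argument object
        func: async (args) => {
          const result = await client.callTool({
            name: t.name,
            arguments: args,
          });
          // Return the tool output as a string for the agent's scratchpad.
          return JSON.stringify(result.content);
        },
      })
  );
}
```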
This repository provides a LangChain client for the Model Context Protocol, enabling seamless connection to MCP servers. It converts MCP tools for use with LangChain, facilitating dynamic conversations.
This repository provides a Python-based Model Context Protocol (MCP) client using LangChain, enabling interaction with MCP servers through LangChain ReAct Agent and supporting LLMs from Anthropic, OpenAI, and Groq.
Litemcp is a lightweight client that simplifies adopting MCP alongside AI SDKs such as LangChain and the Agent SDK, emphasizing simplicity, flexibility, and minimal dependencies.
This repository provides an example client implementation using the Vercel AI SDK and Model Context Protocol SDK to streamline LLM chat interactions within a browser environment, including tool discovery and usage.
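A sketch of that pattern using the AI SDK's experimental MCP client follows; the SSE endpoint and model are placeholders, the exact API surface may vary across SDK versions, and a real browser app would normally route the model call through a server:

```typescript
import { experimental_createMCPClient, generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Connect to an MCP server over SSE (placeholder URL), discover its tools,
// and hand them to the AI SDK for a single chat turn.
const mcpClient = await experimental_createMCPClient({
  transport: { type: "sse", url: "http://localhost:8080/sse" },
});

try {
  const tools = await mcpClient.tools();

  const { text } = await generateText({
    model: openai("gpt-4o-mini"),
    tools,
    prompt: "What tools do you have available, and what can they do?",
  });

  console.log(text);
} finally {
  await mcpClient.close();
}
```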
**NutriPlan AI** is a meal planning app that combines food image detection, calorie prediction, and personalized meal suggestions, using TensorFlow for image analysis and Anthropic Claude AI for creating the meal suggestions.
MCP2SSE Proxy: a ClaudeAI client.
Dream Chronicles is an AI-powered app that interprets dreams by transforming user descriptions into insightful text and audio results. It utilizes Anthropic Claude 3.5 for dream analysis and Google Text-to-Speech for audio output.