
LangChain agents just got a whole lot more powerful — and interoperable.
With the rise of the Model Context Protocol (MCP), developers are rapidly publishing tool servers that expose powerful capabilities through a standardized protocol. Meanwhile, the LangChain ecosystem has become a go-to framework for building agents and orchestrated workflows with language models.
The new langchain-mcp-adapters package brings these two worlds together. In this article, I’ll show how easy it is to reuse an existing MCP server—in this case, a Box MCP server—and turn it into a LangChain-compatible agent using the LangChain MCP adapter. No need to write glue code or reinvent the wheel.
What is MCP, and why use MCP adapters?
- MCP (Model Context Protocol) is a protocol developed by Anthropic to enable structured interactions between language models and tool servers.
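On the wire, MCP is JSON-RPC 2.0, and invoking a tool is a tools/call request. As a rough sketch, this is the shape of the message a client sends (the tool name and arguments below are illustrative, not the server's real schema):

```python
# Sketch of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
# The tool name and arguments are illustrative placeholders.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "box_ai_ask",
        "arguments": {"file_id": "12345", "prompt": "Summarize this document"},
    },
}

# Serialize for transport (stdio, SSE, etc.)
wire = json.dumps(request)
```

The server replies with a JSON-RPC result containing the tool's output, which the client hands back to the model.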
The LangChain MCP adapter lets you:
- Convert MCP tools into LangChain/LangGraph-compatible ones.
- Interact with tools across multiple MCP servers.
- Seamlessly use hundreds of existing MCP tools inside LangChain agents.
This means you can now bring any MCP-compatible server — including custom tools you or others have already built — into the LangChain ecosystem in just a few lines of code.
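Conceptually, the adapter's job is straightforward: take each MCP tool description (name, description, JSON-Schema input) plus a way to call that tool on the session, and wrap it as a callable the agent framework can invoke. Here's a toy, library-free sketch of that mapping — the real load_mcp_tools returns LangChain StructuredTool objects, and all names below are made up for illustration:

```python
# Toy sketch of what the adapter does conceptually: wrap an MCP tool
# description plus a transport callback into a plain Python callable.
# The real adapter produces LangChain StructuredTool objects instead.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class MCPToolSpec:
    name: str
    description: str
    input_schema: dict  # JSON Schema for the tool's arguments

def wrap_mcp_tool(spec: MCPToolSpec, call_tool: Callable[[str, dict], Any]):
    # Return a callable that forwards keyword arguments to the MCP server.
    def tool_fn(**kwargs: Any) -> Any:
        return call_tool(spec.name, kwargs)
    tool_fn.__name__ = spec.name
    tool_fn.__doc__ = spec.description
    return tool_fn

# Usage with a fake transport that just reports what it was asked to do:
spec = MCPToolSpec("box_ai_ask", "Ask Box AI about a file", {"type": "object"})
fake_call = lambda name, args: f"{name} called with {sorted(args)}"
tool = wrap_mcp_tool(spec, fake_call)
```

The adapter does this for every tool the server advertises, which is why hundreds of existing MCP tools become usable with no per-tool glue code.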
What the Box MCP server does
In this demo, our Box MCP server provides structured data extraction capabilities. It wraps Box AI endpoints and exposes tools like box_ai_ask, making it easy for agents to parse unstructured documents stored in Box.
Code Walkthrough
Here’s the gist of how we wire it up:
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Start the MCP server locally using uv
server_params = StdioServerParameters(
    command="uv",
    args=[
        "--directory",
        "/Users/rbarbosa/Documents/code/python/box/mcp-server-box",
        "run",
        "src/mcp_server_box.py",
    ],
)

# Connect and load tools
async with stdio_client(server_params) as (read, write):
    async with ClientSession(read, write) as session:
        await session.initialize()
        tools = await load_mcp_tools(session)
        agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools)

That's it — your Box tools are now LangChain-compatible.
What about LangGraph?
We've got you covered:
from contextlib import asynccontextmanager
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
model = ChatOpenAI(model="gpt-4o")
@asynccontextmanager
async def make_graph():
    async with MultiServerMCPClient(
        {
            "box_mcp": {
                "command": "uv",
                "args": [
                    "--directory",
                    "/code/python/box/mcp-server-box",
                    "run",
                    "src/mcp_server_box.py",
                ],
                "transport": "stdio",
            },
        }
    ) as client:
        agent = create_react_agent(model, client.get_tools())
        yield agent

Run it using:
uv run langgraph dev --config src/langgraph.json
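The langgraph.json referenced by that command isn't shown here, but a minimal one looks roughly like the following — the src/agent.py path and the box_agent graph name are assumptions for illustration, and should point at wherever your make_graph factory actually lives:

```json
{
  "dependencies": ["."],
  "graphs": {
    "box_agent": "./src/agent.py:make_graph"
  },
  "env": ".env"
}
```

With that in place, langgraph dev spins up a local server and loads the agent yielded by make_graph.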
Why this matters
This setup makes it trivial to:
- Reuse any existing MCP tool server inside LangChain or LangGraph agents
- Avoid duplicating code or writing custom tool wrappers
- Chain tools from multiple servers (e.g., Box + Math + CRM tools) in a single agent
- Build LangGraph workflows that use MCP tools
If you already have useful tools exposed via MCP, this adapter makes them instantly accessible to LangChain workflows.
Try it yourself
Check out the LangChain Box MCP adapter GitHub repo and give it a spin with your own tool servers — or reuse the Box one I used here.


