MCPTool
MCPTool enables integration with external tools and services through the Model Context Protocol (MCP).
Mandatory init variables | "name": The name of the tool; "server_info": Information about the MCP server to connect to
API reference | Tools
GitHub link | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/mcp
Overview
MCPTool is a Tool that allows Haystack to communicate with external tools and services using the Model Context Protocol (MCP). MCP is an open protocol that standardizes how applications provide context to LLMs, similar to how USB-C provides a standardized way to connect devices.
The MCPTool supports multiple transport options:
- SSE (Server-Sent Events) for connecting to HTTP servers
- StdIO for direct execution of local programs
Learn more about the MCP protocol and its architecture at the official MCP website.
Parameters
- name is mandatory and specifies the name of the tool.
- server_info is mandatory and must be either an SSEServerInfo or a StdioServerInfo object that contains connection information.
- description is optional and provides context to the LLM about what the tool does.
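For example, you can pass the optional description alongside the mandatory arguments (a minimal sketch; the server URL and tool name are placeholders):
from haystack_integrations.tools.mcp import MCPTool, SSEServerInfo

# The optional description gives the LLM context about when to call this tool
tool = MCPTool(
    name="my_tool",
    server_info=SSEServerInfo(base_url="http://localhost:8000"),
    description="Runs my_tool on the local MCP server",
)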
Usage
Install the MCP-Haystack integration to use the MCPTool:
pip install mcp-haystack
With SSE Transport
You can create an MCPTool that connects to an external HTTP server using SSE transport:
from haystack_integrations.tools.mcp import MCPTool, SSEServerInfo
# Create an MCP tool that connects to an HTTP server
server_info = SSEServerInfo(base_url="http://localhost:8000")
tool = MCPTool(name="my_tool", server_info=server_info)
# Use the tool
result = tool.invoke(param1="value1", param2="value2")
With StdIO Transport
You can also create an MCPTool that executes a local program directly and connects to it via stdio transport:
from haystack_integrations.tools.mcp import MCPTool, StdioServerInfo
# Create an MCP tool that uses stdio transport
server_info = StdioServerInfo(command="uvx", args=["mcp-server-time", "--local-timezone=Europe/Berlin"])
tool = MCPTool(name="time_tool", server_info=server_info)
# Get the current time in New York
result = tool.invoke(timezone="America/New_York")
In a pipeline
You can integrate an MCPTool into a pipeline with a ChatGenerator and a ToolInvoker:
from haystack import Pipeline
from haystack.components.converters import OutputAdapter
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.tools import ToolInvoker
from haystack.dataclasses import ChatMessage
from haystack_integrations.tools.mcp import MCPTool, StdioServerInfo
time_tool = MCPTool(
    name="get_current_time",
    server_info=StdioServerInfo(command="uvx", args=["mcp-server-time", "--local-timezone=Europe/Berlin"]),
)
pipeline = Pipeline()
pipeline.add_component("llm", OpenAIChatGenerator(model="gpt-4o-mini", tools=[time_tool]))
pipeline.add_component("tool_invoker", ToolInvoker(tools=[time_tool]))
pipeline.add_component(
    "adapter",
    OutputAdapter(
        template="{{ initial_msg + initial_tool_messages + tool_messages }}",
        output_type=list[ChatMessage],
        unsafe=True,
    ),
)
pipeline.add_component("response_llm", OpenAIChatGenerator(model="gpt-4o-mini"))
pipeline.connect("llm.replies", "tool_invoker.messages")
pipeline.connect("llm.replies", "adapter.initial_tool_messages")
pipeline.connect("tool_invoker.tool_messages", "adapter.tool_messages")
pipeline.connect("adapter.output", "response_llm.messages")
user_input = "What is the time in New York? Be brief." # can be any city
user_input_msg = ChatMessage.from_user(text=user_input)
result = pipeline.run({"llm": {"messages": [user_input_msg]}, "adapter": {"initial_msg": [user_input_msg]}})
print(result["response_llm"]["replies"][0].text)
# The current time in New York is 1:57 PM.