# Toolset

Group multiple Tools into a single unit.

| | |
| --- | --- |
| **Mandatory init variables** | `tools`: A list of tools |
| **API reference** | Toolset |
| **GitHub link** | https://github.com/deepset-ai/haystack/blob/main/haystack/tools/toolset.py |
| **Package name** | haystack-ai |
## Overview

A Toolset groups multiple Tool instances into a single manageable unit. It simplifies passing tools to components like Chat Generators, ToolInvoker, or Agent, and supports filtering, serialization, and reuse.
Additionally, by subclassing Toolset, you can create implementations that dynamically load tools from external sources like OpenAPI URLs, MCP servers, or other resources.
## Initializing Toolset

Here's how to initialize a Toolset with Tool instances. Alternatively, you can use ComponentTool or MCPTool objects in a Toolset, since they are Tool instances as well.
```python
from typing import Annotated

from haystack.tools import Toolset, tool


@tool
def add_numbers(
    a: Annotated[int, "first number"],
    b: Annotated[int, "second number"],
) -> int:
    """Add two numbers."""
    return a + b


@tool
def subtract_numbers(
    a: Annotated[int, "first number"],
    b: Annotated[int, "second number"],
) -> int:
    """Subtract b from a."""
    return a - b


math_toolset = Toolset([add_numbers, subtract_numbers])
```
## Adding New Tools to Toolset
```python
from typing import Annotated

from haystack.tools import tool


@tool
def multiply_numbers(
    a: Annotated[int, "first number"],
    b: Annotated[int, "second number"],
) -> int:
    """Multiply two numbers."""
    return a * b


math_toolset.add(multiply_numbers)

# or, you can merge toolsets together
math_toolset.add(another_toolset)
```
## Usage

You can use a Toolset wherever Haystack accepts Tools.

The recommended way to use a Toolset is with the Agent component, which manages the tool-calling loop for you. The examples below also show how to wire a Chat Generator and a ToolInvoker together manually for cases where you need fine-grained control.
### With the Agent
```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

agent = Agent(
    chat_generator=OpenAIChatGenerator(model="gpt-5.4-nano"),
    system_prompt="You are a helpful assistant that can do math using the tools at your disposal.",
    tools=math_toolset,
)

response = agent.run(messages=[ChatMessage.from_user("What is 4 + 2?")])
print(response["messages"][-1].text)
```
### With ChatGenerator and ToolInvoker
```python
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.tools import ToolInvoker
from haystack.dataclasses import ChatMessage

chat_generator = OpenAIChatGenerator(model="gpt-5.4-nano", tools=math_toolset)
tool_invoker = ToolInvoker(tools=math_toolset)

user_message = ChatMessage.from_user("What is 10 minus 5?")
replies = chat_generator.run(messages=[user_message])["replies"]
print(f"assistant message: {replies}")

# If the assistant message contains a tool call, run the tool invoker
if replies[0].tool_calls:
    tool_messages = tool_invoker.run(messages=replies)["tool_messages"]
    print(f"tool result: {tool_messages[0].tool_call_result.result}")
```
Output:

```
assistant message: [ChatMessage(
    _role=<ChatRole.ASSISTANT: 'assistant'>,
    _content=[ToolCall(tool_name='subtract_numbers', arguments={'a': 10, 'b': 5}, id='call_awGa5q7KtQ9BrMGPTj6IgEH1')],
    _meta={'model': 'gpt-5.4-nano', 'index': 0, 'finish_reason': 'tool_calls', 'usage': {'completion_tokens': 18, 'prompt_tokens': 75, 'total_tokens': 93}}
)]
tool result: 5
```
### In a Pipeline
```python
from haystack import Pipeline
from haystack.components.converters import OutputAdapter
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.tools import ToolInvoker
from haystack.dataclasses import ChatMessage

pipeline = Pipeline()
pipeline.add_component(
    "llm",
    OpenAIChatGenerator(model="gpt-5.4-nano", tools=math_toolset),
)
pipeline.add_component("tool_invoker", ToolInvoker(tools=math_toolset))
pipeline.add_component(
    "adapter",
    OutputAdapter(
        template="{{ initial_msg + initial_tool_messages + tool_messages }}",
        output_type=list[ChatMessage],
        unsafe=True,
    ),
)
pipeline.add_component("response_llm", OpenAIChatGenerator(model="gpt-5.4-nano"))

pipeline.connect("llm.replies", "tool_invoker.messages")
pipeline.connect("llm.replies", "adapter.initial_tool_messages")
pipeline.connect("tool_invoker.tool_messages", "adapter.tool_messages")
pipeline.connect("adapter.output", "response_llm.messages")

user_input_msg = ChatMessage.from_user(text="What is 2+2?")

result = pipeline.run(
    {
        "llm": {"messages": [user_input_msg]},
        "adapter": {"initial_msg": [user_input_msg]},
    }
)

print(result["response_llm"]["replies"][0].text)
```