API Reference

Haystack experimental tools.

Module haystack_experimental.components.tools.tool_invoker

ToolNotFoundException

Exception raised when a tool is not found in the list of available tools.

StringConversionError

Exception raised when the conversion of a tool result to a string fails.

ToolInvoker

Invokes tools based on prepared tool calls and returns the results as a list of ChatMessage objects.

At initialization, the ToolInvoker component is provided with a list of available tools. At runtime, the component processes a list of ChatMessage objects containing tool calls and invokes the corresponding tools. The results of the tool invocations are returned as a list of ChatMessage objects with the tool role.

Usage example:

from haystack_experimental.dataclasses import ChatMessage, ToolCall, Tool
from haystack_experimental.components.tools import ToolInvoker

# Tool definition
def dummy_weather_function(city: str):
    return f"The weather in {city} is 20 degrees."

parameters = {"type": "object",
              "properties": {"city": {"type": "string"}},
              "required": ["city"]}

tool = Tool(name="weather_tool",
            description="A tool to get the weather",
            function=dummy_weather_function,
            parameters=parameters)

# Usually, the ChatMessage with tool_calls is generated by a Language Model
# Here, we create it manually for demonstration purposes
tool_call = ToolCall(
    tool_name="weather_tool",
    arguments={"city": "Berlin"}
)
message = ChatMessage.from_assistant(tool_calls=[tool_call])

# ToolInvoker initialization and run
invoker = ToolInvoker(tools=[tool])
result = invoker.run(messages=[message])

print(result)
>>  {
>>      'tool_messages': [
>>          ChatMessage(
>>              _role=<ChatRole.TOOL: 'tool'>,
>>              _content=[
>>                  ToolCallResult(
>>                      result='"The weather in Berlin is 20 degrees."',
>>                      origin=ToolCall(
>>                          tool_name='weather_tool',
>>                          arguments={'city': 'Berlin'},
>>                          id=None
>>                      )
>>                  )
>>              ],
>>              _meta={}
>>          )
>>      ]
>>  }

ToolInvoker.__init__

def __init__(tools: List[Tool],
             raise_on_failure: bool = True,
             convert_result_to_json_string: bool = False)

Initialize the ToolInvoker component.

Arguments:

  • tools: A list of tools that can be invoked.
  • raise_on_failure: If True, the component will raise an exception in case of errors (tool not found, tool invocation errors, tool result conversion errors). If False, the component will return a ChatMessage object with error=True and a description of the error in result.
  • convert_result_to_json_string: If True, the tool invocation result will be converted to a string using json.dumps. If False, the tool invocation result will be converted to a string using str.
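The difference made by convert_result_to_json_string can be illustrated with plain json.dumps versus str, independently of Haystack (the tool result below is a stand-in for whatever a tool function returns):

```python
import json

# A hypothetical tool result: a Python dict returned by a tool function
result = {"city": "Berlin", "temperature": 20}

# convert_result_to_json_string=True -> json.dumps produces valid JSON
as_json = json.dumps(result)

# convert_result_to_json_string=False -> str produces the Python repr
as_str = str(result)

print(as_json)  # {"city": "Berlin", "temperature": 20}
print(as_str)   # {'city': 'Berlin', 'temperature': 20}
```

The JSON form is the safer choice when the result is fed back to an LLM or parsed downstream, since the Python repr (single quotes, None, True/False) is not valid JSON.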

ToolInvoker.run

@component.output_types(tool_messages=List[ChatMessage])
def run(messages: List[ChatMessage]) -> Dict[str, Any]

Processes ChatMessage objects containing tool calls and invokes the corresponding tools, if available.

Arguments:

  • messages: A list of ChatMessage objects.

Raises:

  • ToolNotFoundException: If the tool is not found in the list of available tools and raise_on_failure is True.
  • ToolInvocationError: If the tool invocation fails and raise_on_failure is True.
  • StringConversionError: If the conversion of the tool result to a string fails and raise_on_failure is True.

Returns:

A dictionary with the key tool_messages containing a list of ChatMessage objects with the tool role. Each ChatMessage object wraps the result of a tool invocation.

ToolInvoker.to_dict

def to_dict() -> Dict[str, Any]

Serializes the component to a dictionary.

Returns:

Dictionary with serialized data.

ToolInvoker.from_dict

@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "ToolInvoker"

Deserializes the component from a dictionary.

Arguments:

  • data: The dictionary to deserialize from.

Returns:

The deserialized component.

Module haystack_experimental.components.tools.openai.function_caller

OpenAIFunctionCaller

OpenAIFunctionCaller processes a list of chat messages and calls Python functions when needed.

The OpenAIFunctionCaller expects a list of ChatMessages. If a message contains a tool call with a function name and arguments, the component runs the corresponding function and returns the result as a ChatMessage with role 'function'.

OpenAIFunctionCaller.__init__

def __init__(available_functions: Dict[str, Callable])

Initialize the OpenAIFunctionCaller component.

Arguments:

  • available_functions: A dictionary of available functions, mapping each function's name to the function itself. For example, {"weather_function": weather_function}.
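The lookup behind available_functions is a plain name-to-callable dispatch. A minimal, Haystack-independent sketch of that pattern (function name and arguments are illustrative):

```python
from typing import Callable, Dict

def weather_function(city: str) -> str:
    # Dummy function standing in for a real weather lookup
    return f"The weather in {city} is 20 degrees."

# Maps each function name to the function itself
available_functions: Dict[str, Callable] = {"weather_function": weather_function}

# Simulate a tool call as it might arrive from a chat model
tool_name, arguments = "weather_function", {"city": "Berlin"}

if tool_name in available_functions:
    result = available_functions[tool_name](**arguments)
else:
    # A hallucinated or unknown function name falls through here
    result = f"Function '{tool_name}' not found."

print(result)  # The weather in Berlin is 20 degrees.
```

Keeping the dispatch table explicit (rather than, say, resolving names via globals()) limits what the model can invoke to exactly the functions you registered.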

OpenAIFunctionCaller.to_dict

def to_dict() -> Dict[str, Any]

Serializes the component to a dictionary.

Returns:

Dictionary with serialized data.

OpenAIFunctionCaller.from_dict

@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "OpenAIFunctionCaller"

Deserializes the component from a dictionary.

Arguments:

  • data: The dictionary to deserialize from.

Returns:

The deserialized component.

OpenAIFunctionCaller.run

@component.output_types(function_replies=List[ChatMessage],
                        assistant_replies=List[ChatMessage])
def run(messages: List[ChatMessage])

Evaluates messages and invokes available functions if the messages contain tool_calls.

Arguments:

  • messages: A list of messages generated by the OpenAIChatGenerator.

Returns:

This component returns a list of messages in one of two outputs:

  • function_replies: A list of ChatMessages containing the results of function invocations, with role 'function'. If the model hallucinated a wrong or unknown function name, an assistant message explaining the error is returned instead.
  • assistant_replies: A list of ChatMessages containing regular assistant replies. This output is used when the received messages contain no tool_calls.
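The branching between the two outputs can be sketched without Haystack: messages carrying tool calls go to the function branch, all others to the assistant branch (the Message dataclass below is a simplified stand-in for ChatMessage):

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Message:
    content: str
    tool_calls: Optional[List[dict]] = None  # simplified stand-in for ChatMessage

def route(messages: List[Message]) -> Dict[str, List[Message]]:
    function_replies: List[Message] = []
    assistant_replies: List[Message] = []
    for msg in messages:
        if msg.tool_calls:
            function_replies.append(msg)   # would trigger a function invocation
        else:
            assistant_replies.append(msg)  # regular assistant reply
    return {"function_replies": function_replies,
            "assistant_replies": assistant_replies}

out = route([
    Message("", tool_calls=[{"name": "weather_function",
                             "arguments": {"city": "Berlin"}}]),
    Message("Hello! How can I help?"),
])
print(len(out["function_replies"]), len(out["assistant_replies"]))  # 1 1
```

In a pipeline, the two outputs let you connect the function branch back to the generator for another turn while surfacing plain assistant replies directly.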

Module haystack_experimental.components.tools.openapi.openapi_tool

OpenAPITool

The OpenAPITool calls a RESTful endpoint of an OpenAPI service using payloads generated from human instructions.

Here is an example of how to use the OpenAPITool component to scrape a URL using the FireCrawl API:

from haystack.dataclasses import ChatMessage
from haystack_experimental.components.tools.openapi import OpenAPITool, LLMProvider
from haystack.utils import Secret

tool = OpenAPITool(generator_api=LLMProvider.OPENAI,
                   generator_api_params={"model":"gpt-4o-mini"},
                   spec="https://raw.githubusercontent.com/mendableai/firecrawl/main/apps/api/openapi.json",
                   credentials=Secret.from_env_var("FIRECRAWL_API_KEY"))

results = tool.run(messages=[ChatMessage.from_user("Scrape URL: https://news.ycombinator.com/")])
print(results)

Similarly, you can use the OpenAPITool component to invoke any OpenAPI service/tool by providing the OpenAPI specification and credentials.

OpenAPITool.__init__

def __init__(generator_api: LLMProvider,
             generator_api_params: Optional[Dict[str, Any]] = None,
             spec: Optional[Union[str, Path]] = None,
             credentials: Optional[Secret] = None,
             allowed_operations: Optional[List[str]] = None)

Initialize the OpenAPITool component.

Arguments:

  • generator_api: The API provider for the chat generator.
  • generator_api_params: Parameters to pass for the chat generator creation.
  • spec: OpenAPI specification for the tool/service. This can be a URL, a local file path, or an OpenAPI service specification provided as a string.
  • credentials: Credentials for the tool/service.
  • allowed_operations: A list of operations to register with LLMs via the LLM tools parameter. Use the operationId field of an OpenAPI spec path/operation to specify the operation names to use. If not specified, all operations found in the OpenAPI spec are registered with LLMs.
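The effect of allowed_operations is to filter the spec's operations by their operationId before they are exposed to the LLM. A rough, Haystack-independent sketch over a toy spec fragment (paths and operationIds below are invented for illustration):

```python
from typing import List, Optional

# Toy OpenAPI spec fragment; paths and operationIds are invented
spec = {
    "paths": {
        "/scrape": {"post": {"operationId": "scrapeUrl"}},
        "/crawl": {"post": {"operationId": "crawlSite"}},
    }
}

def collect_operations(spec: dict, allowed: Optional[List[str]] = None) -> List[str]:
    ops = []
    for path_item in spec["paths"].values():
        for operation in path_item.values():
            op_id = operation.get("operationId")
            # With no allow-list, every operation is registered
            if allowed is None or op_id in allowed:
                ops.append(op_id)
    return ops

print(collect_operations(spec))                         # ['scrapeUrl', 'crawlSite']
print(collect_operations(spec, allowed=["scrapeUrl"]))  # ['scrapeUrl']
```

Restricting the allow-list keeps the tools payload small and prevents the model from calling endpoints you never intended to expose.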

OpenAPITool.run

@component.output_types(service_response=List[ChatMessage])
def run(messages: List[ChatMessage],
        fc_generator_kwargs: Optional[Dict[str, Any]] = None,
        spec: Optional[Union[str, Path]] = None,
        credentials: Optional[Secret] = None) -> Dict[str, List[ChatMessage]]

Invokes the underlying OpenAPI service/tool with the function calling payload generated by the chat generator.

Arguments:

  • messages: A list of ChatMessages used to generate the function calling payload (e.g. human instructions). The last message should be a human instruction containing enough information to generate a function calling payload suitable for the OpenAPI service/tool used. See the example in the class docstring.
  • fc_generator_kwargs: Additional arguments for the function calling payload generation process.
  • spec: OpenAPI specification for the tool/service; overrides the one provided at initialization.
  • credentials: Credentials for the tool/service; override the credentials provided at initialization.

Returns:

A dictionary containing the service response under the following key:

  • service_response: A list of ChatMessages containing the service response. The ChatMessages are generated from the OpenAPI service/tool response and contain the JSON returned by the service. If the invocation fails, the response is a ChatMessage with the error message under the error key.

OpenAPITool.to_dict

def to_dict() -> Dict[str, Any]

Serialize this component to a dictionary.

Returns:

The serialized component as a dictionary.

OpenAPITool.from_dict

@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "OpenAPITool"

Deserialize this component from a dictionary.

Arguments:

  • data: The dictionary representation of this component.

Returns:

The deserialized component instance.