Version: 2.28-unstable

OpenAPIServiceToFunctions

OpenAPIServiceToFunctions is a component that transforms OpenAPI service specifications into a format compatible with LLM tool calling.

Most common position in a pipeline: Flexible
Mandatory run variables: "sources": A list of OpenAPI specification sources, which can be file paths or ByteStream objects
Output variables: "functions": A list of JSON function definition objects. For each path definition in the OpenAPI specification, a corresponding function definition is generated.

"openapi_specs": A list of JSON/YAML objects with references resolved. Such an OpenAPI spec (with references resolved) can, in turn, be used as input to OpenAPIServiceConnector.
API reference: Converters
GitHub link: https://github.com/deepset-ai/haystack/blob/main/haystack/components/converters/openapi_functions.py

Overview

OpenAPIServiceToFunctions transforms OpenAPI service specifications into function definitions suitable for LLM tool calling. It parses an OpenAPI specification, extracts a function definition for each path, and formats these definitions so that tool-calling LLMs can understand them.

OpenAPIServiceToFunctions is most valuable when used together with the OpenAPIServiceConnector component: it converts OpenAPI specifications into function definitions, and OpenAPIServiceConnector then takes the LLM-generated function-calling payloads and uses them to invoke the underlying REST APIs.
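For orientation, here is roughly the shape of a single function definition in the OpenAI tool-calling format, which is what OpenAPIServiceToFunctions emits for each OpenAPI path. The names and fields below are illustrative, not actual component output:

```python
import json

# Hypothetical function definition in OpenAI tool-calling format.
# The component derives one such dict per OpenAPI path.
function_definition = {
    "name": "search",  # typically derived from the path's operationId
    "description": "Search the web",  # from the operation's description
    "parameters": {  # JSON Schema derived from the request parameters/body
        "type": "object",
        "properties": {
            "q": {"type": "string", "description": "Search query"},
        },
        "required": ["q"],
    },
}

# Function definitions are plain JSON-serializable dicts
print(json.dumps(function_definition, indent=2))
```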

To use OpenAPIServiceToFunctions, you need to install an optional jsonref dependency with:

shell
pip install jsonref

The OpenAPIServiceToFunctions component doesn’t have any init parameters.

Usage

On its own

This component is primarily meant to be used in pipelines. Using it on its own is useful when you want to convert an OpenAPI specification into function definitions once, save them to a file, and reuse them for tool calling later.
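A minimal sketch of the save-and-reuse idea: the definitions returned in the "functions" output are plain dicts, so they can be persisted with nothing but the standard library. The single entry below is a placeholder standing in for the output of `OpenAPIServiceToFunctions().run(sources=[...])["functions"]`:

```python
import json
from pathlib import Path

# Placeholder for OpenAPIServiceToFunctions().run(sources=[...])["functions"];
# this illustrative entry mimics the OpenAI function-calling format.
functions = [
    {
        "name": "search",
        "description": "Search the web",
        "parameters": {"type": "object", "properties": {}},
    }
]

# Persist the definitions so they can be reused for tool calling later
path = Path("function_definitions.json")
path.write_text(json.dumps(functions, indent=2))

# Reload them when building generation kwargs for an LLM
loaded = json.loads(path.read_text())
print([f["name"] for f in loaded])  # -> ['search']
```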

In a pipeline

In a pipeline context, OpenAPIServiceToFunctions is most valuable alongside OpenAPIServiceConnector. For instance, consider integrating the serper.dev search engine bridge into a pipeline. OpenAPIServiceToFunctions converts the Serper OpenAPI specification (fetched from https://bit.ly/serper_dev_spec) into function definitions that an LLM with tool-calling capabilities can understand, and these definitions are then passed as generation_kwargs to the Chat Generator component.

info

To run the following code snippet, you need your own Serper and OpenAI API keys.

python
import json
import os
from typing import Any

import requests

from haystack import Pipeline
from haystack.components.connectors import OpenAPIServiceConnector
from haystack.components.converters import OpenAPIServiceToFunctions, OutputAdapter
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.dataclasses.byte_stream import ByteStream


def prepare_fc_params(openai_functions_schema: dict[str, Any]) -> dict[str, Any]:
    # Wrap a single function schema into generation kwargs that force
    # the LLM to call exactly this function.
    return {
        "tools": [{"type": "function", "function": openai_functions_schema}],
        "tool_choice": {
            "type": "function",
            "function": {"name": openai_functions_schema["name"]},
        },
    }


# The environment variable name here is illustrative; OpenAIChatGenerator
# reads OPENAI_API_KEY from the environment by default.
serper_dev_key = os.environ["SERPERDEV_API_KEY"]

serperdev_spec = requests.get("https://bit.ly/serper_dev_spec").json()
system_prompt = requests.get("https://bit.ly/serper_dev_system").text
user_prompt = "Why was Sam Altman ousted from OpenAI?"

pipe = Pipeline()
pipe.add_component("spec_to_functions", OpenAPIServiceToFunctions())
pipe.add_component(
    "prepare_fc_adapter",
    OutputAdapter(
        "{{functions[0] | prepare_fc}}",
        dict[str, Any],
        {"prepare_fc": prepare_fc_params},
    ),
)
pipe.add_component("functions_llm", OpenAIChatGenerator())
pipe.add_component("openapi_connector", OpenAPIServiceConnector())
pipe.add_component(
    "message_adapter",
    OutputAdapter(
        "{{system_message + service_response}}",
        list[ChatMessage],
        unsafe=True,
    ),
)
pipe.add_component("llm", OpenAIChatGenerator())

# Spec -> function definitions -> generation kwargs for the tool-calling LLM
pipe.connect("spec_to_functions.functions", "prepare_fc_adapter.functions")
# Resolved spec goes to the connector that will invoke the REST API
pipe.connect(
    "spec_to_functions.openapi_specs",
    "openapi_connector.service_openapi_spec",
)
pipe.connect("prepare_fc_adapter", "functions_llm.generation_kwargs")
pipe.connect("functions_llm.replies", "openapi_connector.messages")
pipe.connect("openapi_connector.service_response", "message_adapter.service_response")
pipe.connect("message_adapter", "llm.messages")

result = pipe.run(
    data={
        "functions_llm": {
            "messages": [
                ChatMessage.from_system("Only do tool/function calling"),
                ChatMessage.from_user(user_prompt),
            ],
        },
        "openapi_connector": {
            "service_credentials": serper_dev_key,
        },
        "spec_to_functions": {
            "sources": [ByteStream.from_string(json.dumps(serperdev_spec))],
        },
        "message_adapter": {
            "system_message": [ChatMessage.from_system(system_prompt)],
        },
    },
)

print(result["llm"]["replies"][0].text)

# Sam Altman was ousted from OpenAI on November 17, 2023, following
# a "deliberative review process" by the board of directors. The board concluded
# that he was not "consistently candid in his communications". However, he
# returned as CEO just days after his ouster.
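
The `prepare_fc_params` helper above simply wraps one function schema into the `tools`/`tool_choice` generation kwargs that force the LLM to call that specific function. A quick standalone check with a minimal hypothetical schema:

```python
from typing import Any


def prepare_fc_params(openai_functions_schema: dict[str, Any]) -> dict[str, Any]:
    # Same helper as in the pipeline code above
    return {
        "tools": [{"type": "function", "function": openai_functions_schema}],
        "tool_choice": {
            "type": "function",
            "function": {"name": openai_functions_schema["name"]},
        },
    }


# Minimal hypothetical function schema for illustration
schema = {
    "name": "search",
    "description": "Search the web",
    "parameters": {"type": "object", "properties": {}},
}
kwargs = prepare_fc_params(schema)
print(kwargs["tool_choice"]["function"]["name"])  # -> search
```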