Various connectors to integrate with external services.
Module openapi_service
OpenAPIServiceConnector
```python
@component
class OpenAPIServiceConnector()
```
The `OpenAPIServiceConnector` component connects the Haystack framework to OpenAPI services, enabling it to call operations defined in a service's OpenAPI specification. It integrates with the `ChatMessage` dataclass: the payload of the message determines the method to call and the parameters to pass. The message payload should be an OpenAI JSON-formatted function-calling string consisting of the method name and the parameters to be passed to the method. The method name and parameters are then used to invoke the method on the OpenAPI service, and the response from the service is returned as a `ChatMessage`.
Before using this component, users usually resolve service endpoint parameters with the help of the `OpenAPIServiceToFunctions` component.
The example below demonstrates how to use the `OpenAPIServiceConnector` to invoke a method on the https://serper.dev/ service, specified via an OpenAPI specification. Note, however, that `OpenAPIServiceConnector` is usually not meant to be used directly, but rather as part of a pipeline that includes the `OpenAPIServiceToFunctions` component and an `OpenAIChatGenerator` component backed by an LLM with function-calling capabilities. In the example below, the function-calling payload is written by hand; in a real-world scenario it would usually be generated by the `OpenAIChatGenerator` component, as outlined in the sketch after the usage example.
Usage example:
```python
import json

import requests

from haystack.components.connectors import OpenAPIServiceConnector
from haystack.dataclasses import ChatMessage

fc_payload = [{'function': {'arguments': '{"q": "Why was Sam Altman ousted from OpenAI?"}', 'name': 'search'},
               'id': 'call_PmEBYvZ7mGrQP5PUASA5m9wO', 'type': 'function'}]

serper_token = "<your_serper_dev_token>"
serperdev_openapi_spec = json.loads(requests.get("https://bit.ly/serper_dev_spec").text)
service_connector = OpenAPIServiceConnector()
result = service_connector.run(messages=[ChatMessage.from_assistant(json.dumps(fc_payload))],
                               service_openapi_spec=serperdev_openapi_spec, service_credentials=serper_token)
print(result)

>> {'service_response': [ChatMessage(content='{"searchParameters": {"q": "Why was Sam Altman ousted from OpenAI?",
>> "type": "search", "engine": "google"}, "answerBox": {"snippet": "Concerns over AI safety and OpenAI's role
>> in protecting were at the center of Altman's brief ouster from the company."...
```
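In a real pipeline, the function-calling payload above would come from an LLM rather than being written by hand. The sketch below shows that flow end to end; it is a minimal illustration, not the canonical wiring: the `sources` parameter and the `functions`/`openapi_specs` output names of `OpenAPIServiceToFunctions`, the `tools` wrapping passed to the OpenAI API, and the model name are assumptions to verify against your Haystack version.

```python
# Minimal sketch: let an LLM generate the function-calling payload that
# OpenAPIServiceConnector then executes against the Serper service.
from haystack.components.connectors import OpenAPIServiceConnector
from haystack.components.converters import OpenAPIServiceToFunctions
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

# 1. Convert the OpenAPI spec into function-calling definitions
#    (output socket names `functions` and `openapi_specs` are assumed here).
converted = OpenAPIServiceToFunctions().run(sources=["serper_dev_spec.json"])
functions = converted["functions"]
openapi_spec = converted["openapi_specs"][0]

# 2. Let the LLM answer with a function-calling payload instead of plain text.
generator = OpenAIChatGenerator(model="gpt-4o-mini")  # any function-calling-capable model
reply = generator.run(
    messages=[ChatMessage.from_user("Why was Sam Altman ousted from OpenAI?")],
    generation_kwargs={"tools": [{"type": "function", "function": f} for f in functions]},
)["replies"][0]

# 3. Hand the generated payload to the connector, exactly as in the example above.
result = OpenAPIServiceConnector().run(
    messages=[reply],
    service_openapi_spec=openapi_spec,
    service_credentials="<your_serper_dev_token>",
)
```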
OpenAPIServiceConnector.__init__
```python
def __init__()
```
Initializes the OpenAPIServiceConnector instance.
OpenAPIServiceConnector.run
```python
@component.output_types(service_response=Dict[str, Any])
def run(
    messages: List[ChatMessage],
    service_openapi_spec: Dict[str, Any],
    service_credentials: Optional[Union[dict, str]] = None
) -> Dict[str, List[ChatMessage]]
```
Processes a list of chat messages to invoke a method on an OpenAPI service. It parses the last message in the list, expecting it to contain an OpenAI function-calling descriptor (name and parameters) in JSON format.
Arguments:

- `messages`: A list of `ChatMessage` objects containing the messages to be processed. The last message should contain the function-invocation payload in OpenAI function-calling format. See the example in the class docstring for the expected format.
- `service_openapi_spec`: The OpenAPI JSON specification object of the service to be invoked. All the refs should already be resolved.
- `service_credentials`: The credentials to be used for authentication with the service. Currently, only the `http` and `apiKey` OpenAPI security schemes are supported.
Raises:

- `ValueError`: If the last message is not from the assistant or if it does not contain the correct payload to invoke a method on the service.
Returns:

A dictionary with the following keys:

- `service_response`: a list of `ChatMessage` objects, each containing the response from the service. The response is in JSON format, and the `content` attribute of the `ChatMessage` contains the JSON string.
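Since each returned `ChatMessage` carries the service's raw JSON string in its `content` attribute, a typical next step is to parse it. A short sketch, continuing from the usage example above (the `answerBox`/`snippet` keys are specific to the Serper service):

```python
import json

# `result` is the dictionary returned by OpenAPIServiceConnector.run() above.
service_reply = result["service_response"][0]  # a ChatMessage
payload = json.loads(service_reply.content)    # the JSON string returned by the service

# Keys below are specific to the Serper search response shown in the usage example.
print(payload["answerBox"]["snippet"])
```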