API Reference

Extract the output of a Generator into an Answer format, and build prompts.

Module answer_builder

AnswerBuilder

@component
class AnswerBuilder()

Takes a query and the replies a Generator returns as input and parses them into GeneratedAnswer objects. Optionally, it also takes Documents and metadata from the Generator as inputs to enrich the GeneratedAnswer objects.

Usage example:

from haystack.components.builders import AnswerBuilder

builder = AnswerBuilder(pattern="Answer: (.*)")
builder.run(query="What's the answer?", replies=["This is an argument. Answer: This is the answer."])

AnswerBuilder.__init__

def __init__(pattern: Optional[str] = None,
             reference_pattern: Optional[str] = None)

Creates an instance of the AnswerBuilder component.

Arguments:

  • pattern: The regular expression pattern to use to extract the answer text from the generator output. If not specified, the whole string is used as the answer. The regular expression can have at most one capture group. If a capture group is present, the text matched by the capture group is used as the answer. If no capture group is present, the whole match is used as the answer. Examples: [^\n]+$ finds "this is an answer" in a string "this is an argument.\nthis is an answer". Answer: (.*) finds "this is an answer" in a string "this is an argument. Answer: this is an answer".
  • reference_pattern: The regular expression pattern to use for parsing the document references. We assume that references are specified as indices of the input documents and that indices start at 1. Example: \[(\d+)\] finds "1" in a string "this is an answer[1]". If not specified, no parsing is done, and all documents are referenced.
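
For example, the [^\n]+$ pattern described above keeps only the text after the last newline. This is a minimal sketch; the commented value reflects the documented behavior rather than captured program output.

from haystack.components.builders import AnswerBuilder

# Pattern with no capture group: the whole match becomes the answer text.
builder = AnswerBuilder(pattern=r"[^\n]+$")
result = builder.run(
    query="What's the answer?",
    replies=["this is an argument.\nthis is an answer"],
)
print(result["answers"][0].data)  # expected: "this is an answer"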

AnswerBuilder.run

@component.output_types(answers=List[GeneratedAnswer])
def run(query: str,
        replies: List[str],
        meta: Optional[List[Dict[str, Any]]] = None,
        documents: Optional[List[Document]] = None,
        pattern: Optional[str] = None,
        reference_pattern: Optional[str] = None)

Turns the output of a Generator into Answer objects using regular expressions.

Arguments:

  • query: The query used in the prompts for the Generator.
  • replies: The output of the Generator.
  • meta: The metadata returned by the Generator. If not specified, the generated answer will contain no metadata.
  • documents: The documents used as input to the Generator. If documents are specified, they are added to the Answer objects. If both documents and reference_pattern are specified, the documents referenced in the Generator output are extracted from the input documents and added to the Answer objects.
  • pattern: The regular expression pattern to use to extract the answer text from the generator output. If not specified, the whole string is used as the answer. The regular expression can have at most one capture group. If a capture group is present, the text matched by the capture group is used as the answer. If no capture group is present, the whole match is used as the answer. Examples: [^\n]+$ finds "this is an answer" in a string "this is an argument.\nthis is an answer". Answer: (.*) finds "this is an answer" in a string "this is an argument. Answer: this is an answer".
  • reference_pattern: The regular expression pattern to use for parsing the document references. We assume that references are specified as indices of the input documents and that indices start at 1. Example: \[(\d+)\] finds "1" in a string "this is an answer[1]". If not specified, no parsing is done, and all documents are referenced.

Returns:

A dictionary with the following keys:

  • answers: The answers obtained from the output of the Generator.
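
The sketch below illustrates how documents and reference_pattern interact: with the pattern \[(\d+)\], only the documents whose 1-based indices appear in the reply are attached to the answer. It is a minimal illustration of the documented behavior, not output captured from a run.

from haystack import Document
from haystack.components.builders import AnswerBuilder

builder = AnswerBuilder(reference_pattern=r"\[(\d+)\]")
documents = [
    Document(content="Paris is the capital of France."),
    Document(content="Berlin is the capital of Germany."),
]

result = builder.run(
    query="What is the capital of Germany?",
    replies=["The capital of Germany is Berlin.[2]"],
    documents=documents,
)

answer = result["answers"][0]
print(answer.data)                            # the full reply, since no pattern was set
print([d.content for d in answer.documents])  # expected: only the second document, referenced as [2]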

Module prompt_builder

PromptBuilder

@component
class PromptBuilder()

PromptBuilder is a component that renders a prompt from a template string using Jinja2 templates. The template variables found in the template string are used as input types for the component and are all required.

Usage example:

template = "Translate the following context to {{ target_language }}. Context: {{ snippet }}; Translation:"
builder = PromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")

PromptBuilder.__init__

def __init__(template: str)

Constructs a PromptBuilder component.

Arguments:

  • template: A Jinja2 template string, e.g. "Summarize this document: {{ documents }}\nSummary:"

PromptBuilder.run

@component.output_types(prompt=str)
def run(**kwargs)

Renders the prompt template with the variables provided as keyword arguments.

Arguments:

  • kwargs: The variables that will be used to render the prompt template.

Returns:

A dictionary with the following keys:

  • prompt: The updated prompt text after rendering the prompt template.
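
Because the template is plain Jinja2, loops and attribute access work as usual. The sketch below assumes a documents list and a question variable (both names are arbitrary; any variable used in the template becomes a required input of the component):

from haystack import Document
from haystack.components.builders import PromptBuilder

template = """Answer the question using the documents below.
{% for doc in documents %}
- {{ doc.content }}
{% endfor %}
Question: {{ question }}
Answer:"""

builder = PromptBuilder(template=template)
result = builder.run(
    documents=[Document(content="Berlin is the capital of Germany.")],
    question="What is the capital of Germany?",
)
print(result["prompt"])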

Module dynamic_prompt_builder

DynamicPromptBuilder

@component
class DynamicPromptBuilder()

DynamicPromptBuilder is designed to construct dynamic prompts for the pipeline. Users can change the prompt template at runtime by providing a new template for each pipeline run invocation if needed.

Usage example:

from typing import List
from haystack.components.builders import DynamicPromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack import Pipeline, component, Document
from haystack.utils import Secret

prompt_builder = DynamicPromptBuilder(runtime_variables=["documents"])
llm = OpenAIGenerator(api_key=Secret.from_token("<your-api-key>"), model="gpt-3.5-turbo")


@component
class DocumentProducer:

    @component.output_types(documents=List[Document])
    def run(self, doc_input: str):
        return {"documents": [Document(content=doc_input)]}


pipe = Pipeline()
pipe.add_component("doc_producer", DocumentProducer())
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("doc_producer.documents", "prompt_builder.documents")
pipe.connect("prompt_builder.prompt", "llm.prompt")

template = "Here is the document: {{documents[0].content}} \n Answer: {{query}}"
result = pipe.run(
    data={
        "doc_producer": {"doc_input": "Hello world, I live in Berlin"},
        "prompt_builder": {
            "prompt_source": template,
            "template_variables": {"query": "Where does the speaker live?"},
        },
    }
)
print(result)

>> {'llm': {'replies': ['The speaker lives in Berlin.'],
>> 'meta': [{'model': 'gpt-3.5-turbo-0613',
>> 'index': 0,
>> 'finish_reason': 'stop',
>> 'usage': {'prompt_tokens': 28,
>> 'completion_tokens': 6,
>> 'total_tokens': 34}}]}}

Note how in the example above, we can dynamically change the prompt template by providing a new template to the run method of the pipeline. This dynamic prompt generation is in contrast to the static prompt generation using PromptBuilder, where the prompt template is fixed for the pipeline's lifetime and cannot be changed for each pipeline run invocation.

DynamicPromptBuilder.__init__

def __init__(runtime_variables: Optional[List[str]] = None)

Constructs a DynamicPromptBuilder component.

Arguments:

  • runtime_variables: A list of template variable names you can use in prompt construction. For example, if runtime_variables contains the string documents, the component will create an input called documents of type Any. These variable names are used to resolve variables and their values during pipeline execution. The values associated with variables from the pipeline runtime are then injected into template placeholders of a prompt text template that is provided to the run method.

DynamicPromptBuilder.run

def run(prompt_source: str,
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)

Executes the dynamic prompt building process.

The string template provided in prompt_source is rendered with the variables resolved from the pipeline runtime and any additional template variables provided directly to this method; both sets of variables are merged before the final prompt is rendered.

Arguments:

  • prompt_source: A string template.
  • template_variables: An optional dictionary of template variables. These are additional variables users can provide directly to this method, on top of the variables resolved from the pipeline runtime (declared through runtime_variables at initialization).
  • kwargs: Additional keyword arguments, typically resolved from a pipeline, which are merged with the provided template variables.

Returns:

A dictionary with the following keys:

  • prompt: The updated prompt text after rendering the string template.
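
Outside a pipeline, the same merging behavior can be exercised by calling run directly. This is a sketch under the assumption that documents was declared in runtime_variables; in a pipeline that value would arrive from an upstream component, here it is passed as a keyword argument:

from haystack import Document
from haystack.components.builders import DynamicPromptBuilder

builder = DynamicPromptBuilder(runtime_variables=["documents"])

result = builder.run(
    prompt_source="Here is the document: {{ documents[0].content }} \n Answer: {{ query }}",
    template_variables={"query": "Where does the speaker live?"},
    documents=[Document(content="Hello world, I live in Berlin")],  # normally resolved from the pipeline
)
print(result["prompt"])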

Module dynamic_chat_prompt_builder

DynamicChatPromptBuilder

@component
class DynamicChatPromptBuilder()

DynamicChatPromptBuilder is designed to construct dynamic prompts from a list of ChatMessage instances. It integrates with Jinja2 templating for dynamic prompt generation. It assumes that the last user message in the list contains a template and renders it with variables provided to the constructor. Additional template variables can be fed into the pipeline's run method and are merged before the template is rendered.

Usage example:

from haystack.components.builders import DynamicChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack.utils import Secret

# no parameter init, we don't use any runtime template variables
prompt_builder = DynamicChatPromptBuilder()
llm = OpenAIChatGenerator(api_key=Secret.from_token("<your-api-key>"), model="gpt-3.5-turbo")

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")

location = "Berlin"
system_message = ChatMessage.from_system("You are a helpful assistant giving out valuable information to tourists.")
messages = [system_message, ChatMessage.from_user("Tell me about {{location}}")]


res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location}, "prompt_source": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Berlin is the capital city of Germany and one of the most vibrant
and diverse cities in Europe. Here are some key things to know...Enjoy your time exploring the vibrant and dynamic
capital of Germany!", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613',
'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 27, 'completion_tokens': 681, 'total_tokens':
708}})]}}


messages = [system_message, ChatMessage.from_user(
    "What's the weather forecast for {{location}} in the next {{day_count}} days?"
)]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "day_count": "5"},
                                        "prompt_source": messages}})

print(res)
>> {'llm': {'replies': [ChatMessage(content="Here is the weather forecast for Berlin in the next 5
days:

Day 1: Mostly cloudy with a high of 22°C (72°F) and...so it's always a good idea to check for updates closer to your visit.", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613', 'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 37, 'completion_tokens': 201, 'total_tokens': 238}})]}}

Note that the weather forecast in the example above is fictional, but it can be easily connected to a weather
API to provide real weather forecasts.

DynamicChatPromptBuilder.__init__

def __init__(runtime_variables: Optional[List[str]] = None)

Constructs a DynamicChatPromptBuilder component.

Arguments:

  • runtime_variables: A list of template variable names you can use in chat prompt construction. For example, if runtime_variables contains the string documents, the component will create an input called documents of type Any. These variable names are used to resolve variables and their values during pipeline execution. The values associated with variables from the pipeline runtime are then injected into template placeholders of a ChatMessage that is provided to the run method.
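
The pipeline example above does not use runtime variables. As a minimal sketch (with documents as an assumed runtime variable name), declaring it lets an upstream component, or a direct keyword argument, supply values for the chat template:

from haystack import Document
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.dataclasses import ChatMessage

builder = DynamicChatPromptBuilder(runtime_variables=["documents"])
messages = [ChatMessage.from_user("Here is the document: {{ documents[0].content }} Answer: {{ query }}")]

result = builder.run(
    prompt_source=messages,
    template_variables={"query": "Where does the speaker live?"},
    documents=[Document(content="Hello world, I live in Berlin")],  # normally resolved from the pipeline
)
print(result["prompt"][0].content)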

DynamicChatPromptBuilder.run

def run(prompt_source: List[ChatMessage],
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)

Executes the dynamic prompt building process by processing a list of ChatMessage instances.

The last user message is treated as a template and rendered with the variables provided to the constructor. You can provide additional template variables directly to this method, which are then merged with the variables provided to the constructor.

Arguments:

  • prompt_source: A list of ChatMessage instances. The last user message is assumed to contain the template for the chat prompt.
  • template_variables: A dictionary of template variables. These are additional variables users can provide directly to this method, on top of the variables resolved from the pipeline runtime (declared through runtime_variables at initialization).
  • kwargs: Additional keyword arguments, typically resolved from a pipeline, which are merged with the provided template variables.

Returns:

A dictionary with the following keys:

  • prompt: The updated list of ChatMessage instances after rendering the string template.
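
As a closing sketch (illustrative, based on the behavior described above), calling run directly shows that only the last user message is rendered, while the preceding messages are passed through in the returned list:

from haystack.components.builders import DynamicChatPromptBuilder
from haystack.dataclasses import ChatMessage

builder = DynamicChatPromptBuilder()
messages = [
    ChatMessage.from_system("You are a helpful assistant."),
    ChatMessage.from_user("Tell me about {{ location }}"),
]

result = builder.run(prompt_source=messages, template_variables={"location": "Berlin"})
print(result["prompt"][-1].content)  # expected: "Tell me about Berlin"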