Extract the output of a Generator to an Answer format, and build prompts.

# Module answer_builder

## AnswerBuilder

Takes a query and the replies a Generator returns as input and parses them into `GeneratedAnswer` objects.

Optionally, it also takes Documents and metadata from the Generator as inputs to enrich the `GeneratedAnswer` objects.

Usage example:

```python
from haystack.components.builders import AnswerBuilder

builder = AnswerBuilder(pattern="Answer: (.*)")
builder.run(query="What's the answer?", replies=["This is an argument. Answer: This is the answer."])
```
#### AnswerBuilder.\_\_init\_\_

```python
def __init__(pattern: Optional[str] = None,
             reference_pattern: Optional[str] = None)
```

Creates an instance of the AnswerBuilder component.

Arguments:

- `pattern`: The regular expression pattern to use to extract the answer text from the generator output.
If not specified, the whole string is used as the answer.
The regular expression can have at most one capture group. If a capture group is present, the text matched by the capture group is used as the answer. If no capture group is present, the whole match is used as the answer (see the sketch below).
Examples:
`[^\n]+$` finds "this is an answer" in a string "this is an argument.\nthis is an answer".
`Answer: (.*)` finds "this is an answer" in a string "this is an argument. Answer: this is an answer".
- `reference_pattern`: The regular expression pattern to use for parsing the document references.
We assume that references are specified as indices of the input documents and that indices start at 1.
Example: `\[(\d+)\]` finds "1" in a string "this is an answer[1]".
If not specified, no parsing is done, and all documents are referenced.
#### AnswerBuilder.run

```python
@component.output_types(answers=List[GeneratedAnswer])
def run(query: str,
        replies: List[str],
        meta: Optional[List[Dict[str, Any]]] = None,
        documents: Optional[List[Document]] = None,
        pattern: Optional[str] = None,
        reference_pattern: Optional[str] = None)
```

Turns the output of a Generator into `Answer` objects using regular expressions.

Arguments:

- `query`: The query used in the prompts for the Generator.
- `replies`: The output of the Generator.
- `meta`: The metadata returned by the Generator. If not specified, the generated answer will contain no metadata.
- `documents`: The documents used as input to the Generator. If `documents` are specified, they are added to the `Answer` objects. If both `documents` and `reference_pattern` are specified, the documents referenced in the Generator output are extracted from the input documents and added to the `Answer` objects (see the example below).
- `pattern`: The regular expression pattern to use to extract the answer text from the generator output.
If not specified, the whole string is used as the answer.
The regular expression can have at most one capture group. If a capture group is present, the text matched by the capture group is used as the answer. If no capture group is present, the whole match is used as the answer.
Examples:
`[^\n]+$` finds "this is an answer" in a string "this is an argument.\nthis is an answer".
`Answer: (.*)` finds "this is an answer" in a string "this is an argument. Answer: this is an answer".
- `reference_pattern`: The regular expression pattern to use for parsing the document references.
We assume that references are specified as indices of the input documents and that indices start at 1.
Example: `\[(\d+)\]` finds "1" in a string "this is an answer[1]".
If not specified, no parsing is done, and all documents are referenced.

Returns:

A dictionary with the following keys:

- `answers`: The answers obtained from the output of the generator.
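Building on the argument descriptions above, here is a minimal sketch that combines `documents` with `reference_pattern` so that only the referenced documents end up in the answer (the documents and reply text are illustrative):

```python
from haystack import Document
from haystack.components.builders import AnswerBuilder

docs = [
    Document(content="Paris is the capital of France."),
    Document(content="Berlin is the capital of Germany."),
]

builder = AnswerBuilder(reference_pattern=r"\[(\d+)\]")
result = builder.run(
    query="What is the capital of Germany?",
    replies=["The capital of Germany is Berlin[2]."],
    documents=docs,
)

answer = result["answers"][0]
print(answer.data)                            # "The capital of Germany is Berlin[2]."
print([d.content for d in answer.documents])  # only the second document, because index 2 is referenced
```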
# Module prompt_builder

## PromptBuilder

PromptBuilder is a component that renders a prompt from a template string using Jinja2 templates.

For prompt engineering, users can switch the template at runtime by providing a template for each pipeline run invocation.

The template variables found in the default template string are used as input types for the component and are all optional,
unless explicitly specified. If an optional template variable is not provided as an input, it will be replaced with
an empty string in the rendered prompt. Use `variables` and `required_variables` to change the default variable behavior.
Usage examples

On its own

Below is an example of using the `PromptBuilder` to render a prompt template and fill it with `target_language` and `snippet`.
The PromptBuilder returns a prompt with the string "Translate the following context to spanish. Context: I can't speak spanish.; Translation:".

```python
from haystack.components.builders import PromptBuilder

template = "Translate the following context to {{ target_language }}. Context: {{ snippet }}; Translation:"
builder = PromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")
```
In a Pipeline

Below is an example of a RAG pipeline where we use a `PromptBuilder` to render a custom prompt template and fill it with the
contents of retrieved Documents and a query. The rendered prompt is then sent to a Generator.

```python
from haystack import Pipeline, Document
from haystack.utils import Secret
from haystack.components.generators import OpenAIGenerator
from haystack.components.builders.prompt_builder import PromptBuilder

# in a real world use case documents could come from a retriever, web, or any other source
documents = [Document(content="Joe lives in Berlin"), Document(content="Joe is a software engineer")]

prompt_template = """
Given these documents, answer the question.
Documents:
{% for doc in documents %}
    {{ doc.content }}
{% endfor %}

Question: {{query}}
Answer:
"""

p = Pipeline()
p.add_component(instance=PromptBuilder(template=prompt_template), name="prompt_builder")
p.add_component(instance=OpenAIGenerator(api_key=Secret.from_env_var("OPENAI_API_KEY")), name="llm")
p.connect("prompt_builder", "llm")

question = "Where does Joe live?"
result = p.run({"prompt_builder": {"documents": documents, "query": question}})
print(result)
```
Changing the template at runtime (Prompt Engineering)

`PromptBuilder` allows you to switch the prompt template of an existing pipeline.
The example below builds on top of the pipeline from the previous section.
The existing pipeline is invoked with a new prompt template:

```python
documents = [
    Document(content="Joe lives in Berlin", meta={"name": "doc1"}),
    Document(content="Joe is a software engineer", meta={"name": "doc2"}),
]

new_template = """
You are a helpful assistant.
Given these documents, answer the question.
Documents:
{% for doc in documents %}
    Document {{ loop.index }}:
    Document name: {{ doc.meta['name'] }}
    {{ doc.content }}
{% endfor %}

Question: {{ query }}
Answer:
"""

p.run({
    "prompt_builder": {
        "documents": documents,
        "query": question,
        "template": new_template,
    },
})
```
If you want to use different variables during prompt engineering than in the default template,
you can do so by setting `PromptBuilder`'s `variables` init parameter accordingly, as shown in the sketch below.
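A minimal sketch of that approach, assuming you want to experiment with an extra `persona` variable that the default template does not use (the variable names are illustrative):

```python
from haystack.components.builders import PromptBuilder

# Declare every variable you may want to use while iterating on templates,
# even if the default template only uses some of them.
builder = PromptBuilder(
    template="Question: {{ query }}\nAnswer:",
    variables=["query", "persona"],
)

experimental_template = "You are {{ persona }}.\nQuestion: {{ query }}\nAnswer:"
builder.run(query="Where does Joe live?", persona="a concise assistant", template=experimental_template)
```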
Overwriting variables at runtime

In case you want to overwrite the values of variables, you can use `template_variables` during runtime as illustrated below:

```python
language_template = """
You are a helpful assistant.
Given these documents, answer the question.
Documents:
{% for doc in documents %}
    Document {{ loop.index }}:
    Document name: {{ doc.meta['name'] }}
    {{ doc.content }}
{% endfor %}

Question: {{ query }}
Please provide your answer in {{ answer_language | default('English') }}
Answer:
"""

p.run({
    "prompt_builder": {
        "documents": documents,
        "query": question,
        "template": language_template,
        "template_variables": {"answer_language": "German"},
    },
})
```

Note that `language_template` introduces the variable `answer_language`, which is not bound to any pipeline variable.
If not set otherwise, it would evaluate to its default value 'English'.
In this example, we overwrite its value with 'German'.
`template_variables` allows you to overwrite pipeline variables (such as `documents`) as well, as sketched below.
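For instance, still using the pipeline above, you could temporarily replace the `documents` the pipeline would normally pass to the builder, just for a single run (the override content and metadata are illustrative):

```python
p.run({
    "prompt_builder": {
        "documents": documents,
        "query": question,
        "template": language_template,
        # overwrite the pipeline-provided documents for this run only
        "template_variables": {"documents": [Document(content="Joe moved to Munich", meta={"name": "doc3"})]},
    },
})
```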
#### PromptBuilder.\_\_init\_\_

```python
def __init__(template: str,
             required_variables: Optional[List[str]] = None,
             variables: Optional[List[str]] = None)
```

Constructs a PromptBuilder component.

Arguments:

- `template`: A Jinja2 template string that is used to render the prompt, e.g.:
`"Summarize this document: {{ documents[0].content }}\nSummary:"`
- `required_variables`: An optional list of input variables that must be provided at runtime. If a required variable is not provided at runtime, an exception is raised (see the sketch below).
- `variables`: An optional list of input variables to be used in prompt templates instead of the ones inferred from `template`. For example, if you want to use more variables during prompt engineering than the ones present in the default template, you can provide them here.
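For instance, a declared required variable that is missing at runtime raises a `ValueError`. A minimal sketch (the template and variable name are illustrative):

```python
from haystack.components.builders import PromptBuilder

builder = PromptBuilder(template="Hello, {{ name }}!", required_variables=["name"])

builder.run(name="Ada")  # {'prompt': 'Hello, Ada!'}
builder.run()            # raises ValueError because the required variable `name` was not provided
```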
#### PromptBuilder.to_dict

```python
def to_dict() -> Dict[str, Any]
```

Returns a dictionary representation of the component.

Returns:

Serialized dictionary representation of the component.
#### PromptBuilder.run

```python
@component.output_types(prompt=str)
def run(template: Optional[str] = None,
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)
```

Renders the prompt template with the provided variables.

It applies the template variables to render the final prompt. You can provide variables via pipeline kwargs.
To overwrite the default template, set the `template` parameter.
To overwrite pipeline kwargs, set the `template_variables` parameter (see the example below).

Arguments:

- `template`: An optional string template to overwrite PromptBuilder's default template. If None, the default template provided at initialization is used.
- `template_variables`: An optional dictionary of template variables to overwrite the pipeline variables.
- `kwargs`: Pipeline variables used for rendering the prompt.

Raises:

- `ValueError`: If any of the required template variables is not provided.

Returns:

A dictionary with the following keys:

- `prompt`: The updated prompt text after rendering the prompt template.
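A short standalone sketch of the overwrite behavior described above (the template and names are illustrative):

```python
from haystack.components.builders import PromptBuilder

builder = PromptBuilder(template="Hello, {{ name }}!")

# Variables passed as kwargs (pipeline variables) are used directly.
builder.run(name="Ada")                                        # {'prompt': 'Hello, Ada!'}

# On collision, template_variables overwrite the pipeline kwargs.
builder.run(name="Ada", template_variables={"name": "Grace"})  # {'prompt': 'Hello, Grace!'}
```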
# Module dynamic_prompt_builder

## DynamicPromptBuilder

DynamicPromptBuilder is designed to construct dynamic prompts for the pipeline.

Users can change the prompt template at runtime by providing a new template for each pipeline run invocation if needed.

Usage example:

```python
from typing import List
from haystack.components.builders import DynamicPromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack import Pipeline, component, Document
from haystack.utils import Secret

prompt_builder = DynamicPromptBuilder(runtime_variables=["documents"])
llm = OpenAIGenerator(api_key=Secret.from_token("<your-api-key>"), model="gpt-3.5-turbo")


@component
class DocumentProducer:

    @component.output_types(documents=List[Document])
    def run(self, doc_input: str):
        return {"documents": [Document(content=doc_input)]}


pipe = Pipeline()
pipe.add_component("doc_producer", DocumentProducer())
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("doc_producer.documents", "prompt_builder.documents")
pipe.connect("prompt_builder.prompt", "llm.prompt")

template = "Here is the document: {{documents[0].content}} \n Answer: {{query}}"
result = pipe.run(
    data={
        "doc_producer": {"doc_input": "Hello world, I live in Berlin"},
        "prompt_builder": {
            "prompt_source": template,
            "template_variables": {"query": "Where does the speaker live?"},
        },
    }
)
print(result)

>> {'llm': {'replies': ['The speaker lives in Berlin.'],
>> 'meta': [{'model': 'gpt-3.5-turbo-0613',
>> 'index': 0,
>> 'finish_reason': 'stop',
>> 'usage': {'prompt_tokens': 28,
>> 'completion_tokens': 6,
>> 'total_tokens': 34}}]}}
```

Note how in the example above, we can dynamically change the prompt template by providing a new template to the
run method of the pipeline. This dynamic prompt generation is in contrast to the static prompt generation
using `PromptBuilder`, where the prompt template is fixed for the pipeline's lifetime and cannot be changed
for each pipeline run invocation.
#### DynamicPromptBuilder.\_\_init\_\_

```python
def __init__(runtime_variables: Optional[List[str]] = None)
```

Constructs a DynamicPromptBuilder component.

Arguments:

- `runtime_variables`: A list of template variable names you can use in prompt construction. For example, if `runtime_variables` contains the string `documents`, the component will create an input called `documents` of type `Any`. These variable names are used to resolve variables and their values during pipeline execution. The values associated with variables from the pipeline runtime are then injected into template placeholders of a prompt text template that is provided to the `run` method.
#### DynamicPromptBuilder.run

```python
def run(prompt_source: str,
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)
```

Executes the dynamic prompt building process.

Depending on the provided type of `prompt_source`, this method either processes a list of `ChatMessage` instances or a string template. In the case of `ChatMessage` instances, the last user message is treated as a template and rendered with the resolved pipeline variables and any additional template variables provided.

For a string template, it directly applies the template variables to render the final prompt. You can provide additional template variables directly to this method; they are then merged with the variables resolved from the pipeline runtime.

Arguments:

- `prompt_source`: A string template.
- `template_variables`: An optional dictionary of template variables. Template variables provided at initialization are required to resolve pipeline variables, and these are additional variables users can provide directly to this method.
- `kwargs`: Additional keyword arguments, typically resolved from a pipeline, which are merged with the provided template variables.

Returns:

A dictionary with the following keys:

- `prompt`: The updated prompt text after rendering the string template.
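A minimal standalone sketch of the string-template path described above (the template text and variable names are illustrative):

```python
from haystack.components.builders import DynamicPromptBuilder

builder = DynamicPromptBuilder()
result = builder.run(
    prompt_source="Answer in {{ language }}: {{ query }}",
    template_variables={"language": "English", "query": "Where does Joe live?"},
)
print(result["prompt"])  # Answer in English: Where does Joe live?
```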
# Module dynamic_chat_prompt_builder

## DynamicChatPromptBuilder

DynamicChatPromptBuilder is designed to construct dynamic prompts from a list of `ChatMessage` instances.

It integrates with Jinja2 templating for dynamic prompt generation. It considers any user or system message in the
list as potentially containing a template and renders it with variables provided to the constructor. Additional
template variables can be fed into the component/pipeline `run` method and will be merged before rendering the
template.

Usage example:

```python
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack.utils import Secret

# no parameter init, we don't use any runtime template variables
prompt_builder = DynamicChatPromptBuilder()
llm = OpenAIChatGenerator(api_key=Secret.from_token("<your-api-key>"), model="gpt-3.5-turbo")

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")

location = "Berlin"
language = "English"
system_message = ChatMessage.from_system("You are an assistant giving information to tourists in {{language}}")
messages = [system_message, ChatMessage.from_user("Tell me about {{location}}")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "language": language},
                                        "prompt_source": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Berlin is the capital city of Germany and one of the most vibrant
and diverse cities in Europe. Here are some key things to know...Enjoy your time exploring the vibrant and dynamic
capital of Germany!", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613',
'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 27, 'completion_tokens': 681, 'total_tokens':
708}})]}}

messages = [system_message,
            ChatMessage.from_user("What's the weather forecast for {{location}} in the next {{day_count}} days?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "day_count": "5"},
                                        "prompt_source": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Here is the weather forecast for Berlin in the next 5 days:
Day 1: Mostly cloudy with a high of 22°C (72°F) and...so it's always a good idea to check for updates closer to your
visit.", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613', 'index': 0,
'finish_reason': 'stop', 'usage': {'prompt_tokens': 37, 'completion_tokens': 201, 'total_tokens': 238}})]}}
```

Note that the weather forecast in the example above is fictional, but it can be easily connected to a weather
API to provide real weather forecasts.
#### DynamicChatPromptBuilder.\_\_init\_\_

```python
def __init__(runtime_variables: Optional[List[str]] = None)
```

Constructs a DynamicChatPromptBuilder component.

Arguments:

- `runtime_variables`: A list of template variable names you can use in chat prompt construction. For example, if `runtime_variables` contains the string `documents`, the component will create an input called `documents` of type `Any`. These variable names are used to resolve variables and their values during pipeline execution. The values associated with variables from the pipeline runtime are then injected into template placeholders of a ChatMessage that is provided to the `run` method.
#### DynamicChatPromptBuilder.run

```python
def run(prompt_source: List[ChatMessage],
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)
```

Executes the dynamic prompt building process by processing a list of `ChatMessage` instances.

Any user message or system message is inspected for templates and rendered with the variables provided to the constructor. You can provide additional template variables directly to this method, which are then merged with the variables provided to the constructor.

Arguments:

- `prompt_source`: A list of `ChatMessage` instances. All user and system messages are treated as potentially having templates and are rendered with the provided template variables - if templates are found.
- `template_variables`: A dictionary of template variables. Template variables provided at initialization are required to resolve pipeline variables, and these are additional variables users can provide directly to this method.
- `kwargs`: Additional keyword arguments, typically resolved from a pipeline, which are merged with the provided template variables.

Raises:

- `ValueError`: If `chat_messages` is empty or contains elements that are not instances of `ChatMessage`.
- `ValueError`: If the last message in `chat_messages` is not from a user.

Returns:

A dictionary with the following keys:

- `prompt`: The updated list of `ChatMessage` instances after rendering the found templates.
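A minimal standalone sketch of rendering a list of chat messages outside a pipeline, assuming the component behaves exactly as documented above (the message text and variable names are illustrative):

```python
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.dataclasses import ChatMessage

builder = DynamicChatPromptBuilder()
result = builder.run(
    prompt_source=[
        ChatMessage.from_system("You answer in {{ language }}."),
        ChatMessage.from_user("Tell me about {{ city }}."),  # the last message must come from a user
    ],
    template_variables={"language": "English", "city": "Berlin"},
)
print(result["prompt"])  # the same messages with the template placeholders filled in
```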
# Module chat_prompt_builder

## ChatPromptBuilder

ChatPromptBuilder is a component that renders a chat prompt from a template string using Jinja2 templates.

It is designed to construct prompts for the pipeline using static or dynamic templates: Users can change
the prompt template at runtime by providing a new template for each pipeline run invocation if needed.

The template variables found in the init template string are used as input types for the component and are all optional,
unless explicitly specified. If an optional template variable is not provided as an input, it will be replaced with
an empty string in the rendered prompt. Use `variables` and `required_variables` to specify the input types and
required variables.

Usage example with static prompt template:

```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

template = [ChatMessage.from_user("Translate to {{ target_language }}. Context: {{ snippet }}; Translation:")]
builder = ChatPromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")
```

Usage example of overriding the static template at runtime:

```python
template = [ChatMessage.from_user("Translate to {{ target_language }}. Context: {{ snippet }}; Translation:")]
builder = ChatPromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")

summary_template = [ChatMessage.from_user("Translate to {{ target_language }} and summarize. Context: {{ snippet }}; Summary:")]
builder.run(target_language="spanish", snippet="I can't speak spanish.", template=summary_template)
```

Usage example with dynamic prompt template:

```python
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack.utils import Secret

# no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()
llm = OpenAIChatGenerator(api_key=Secret.from_token("<your-api-key>"), model="gpt-3.5-turbo")

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")

location = "Berlin"
language = "English"
system_message = ChatMessage.from_system("You are an assistant giving information to tourists in {{language}}")
messages = [system_message, ChatMessage.from_user("Tell me about {{location}}")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "language": language},
                                        "template": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Berlin is the capital city of Germany and one of the most vibrant
and diverse cities in Europe. Here are some key things to know...Enjoy your time exploring the vibrant and dynamic
capital of Germany!", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613',
'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 27, 'completion_tokens': 681, 'total_tokens':
708}})]}}

messages = [system_message,
            ChatMessage.from_user("What's the weather forecast for {{location}} in the next {{day_count}} days?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "day_count": "5"},
                                        "template": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Here is the weather forecast for Berlin in the next 5 days:
Day 1: Mostly cloudy with a high of 22°C (72°F) and...so it's always a good idea to check for updates closer to your
visit.", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613', 'index': 0,
'finish_reason': 'stop', 'usage': {'prompt_tokens': 37, 'completion_tokens': 201, 'total_tokens': 238}})]}}
```

Note how in the example above, we can dynamically change the prompt template by providing a new template to the
run method of the pipeline.
#### ChatPromptBuilder.\_\_init\_\_

```python
def __init__(template: Optional[List[ChatMessage]] = None,
             required_variables: Optional[List[str]] = None,
             variables: Optional[List[str]] = None)
```

Constructs a ChatPromptBuilder component.

Arguments:

- `template`: A list of `ChatMessage` instances. All user and system messages are treated as potentially having Jinja2 templates and are rendered with the provided template variables. If not provided, the template must be provided at runtime using the `template` parameter of the `run` method.
- `required_variables`: An optional list of input variables that must be provided at all times. If a required variable is not provided at runtime, an exception is raised.
- `variables`: A list of template variable names you can use in prompt construction. For example, if `variables` contains the string `documents`, the component will create an input called `documents` of type `Any`. These variable names are used to resolve variables and their values during pipeline execution. The values associated with variables from the pipeline runtime are then injected into template placeholders of a prompt text template that is provided to the `run` method. If not provided, variables are inferred from `template` (see the sketch below).
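For example, a builder constructed without an init template can declare its inputs explicitly via `variables` and receive the template at runtime. A minimal sketch under that assumption (the template and variable names are illustrative):

```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

# Declare the inputs explicitly because there is no init template to infer them from.
builder = ChatPromptBuilder(variables=["query", "notes"])

template = [ChatMessage.from_user("Notes: {{ notes }}\nQuestion: {{ query }}")]
builder.run(template=template, query="Where does Joe live?", notes="Joe lives in Berlin")
```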
#### ChatPromptBuilder.run

```python
@component.output_types(prompt=List[ChatMessage])
def run(template: Optional[List[ChatMessage]] = None,
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)
```

Executes the prompt building process.

It applies the template variables to render the final prompt. You can provide variables either via pipeline
(set through `variables` or inferred from `template` at initialization) or via additional template variables
set directly to this method. On collision, the variables provided directly to this method take precedence.

Arguments:

- `template`: An optional list of `ChatMessage` instances to overwrite ChatPromptBuilder's default template. If None, the default template provided at initialization is used.
- `template_variables`: An optional dictionary of template variables. These are additional variables users can provide directly to this method in contrast to pipeline variables.
- `kwargs`: Pipeline variables (typically resolved from a pipeline) which are merged with the provided template variables.

Raises:

- `ValueError`: If `chat_messages` is empty or contains elements that are not instances of `ChatMessage`.

Returns:

A dictionary with the following keys:

- `prompt`: The updated list of `ChatMessage` instances after rendering the found templates.
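To illustrate the precedence rule described above, a short standalone sketch (the template and values are illustrative):

```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

builder = ChatPromptBuilder(template=[ChatMessage.from_user("Translate to {{ target_language }}: {{ snippet }}")])

# On collision, template_variables take precedence over the pipeline/kwargs value of target_language.
result = builder.run(
    target_language="spanish",
    snippet="Hello",
    template_variables={"target_language": "german"},
)
print(result["prompt"])  # [ChatMessage(content='Translate to german: Hello', ...)]
```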