Extract the output of a Generator to an Answer format, and build prompts.
Module haystack_experimental.components.builders.chat_prompt_builder
ChatPromptBuilder
Renders a chat prompt from a template of `ChatMessage` objects using Jinja2 syntax.
It constructs prompts using static or dynamic templates, which you can update for each pipeline run.
Template variables in the template are optional unless specified otherwise.
If an optional variable isn't provided, it defaults to an empty string. Use the `variables`
and `required_variables` parameters to define input types and required variables.
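The optional-by-default semantics can be illustrated with a minimal, hypothetical sketch in plain Python (a regex stand-in for the Jinja2 rendering the component actually uses; `render` and its `required` parameter are illustrative, not the real implementation):

```python
import re

def render(template, required=None, **values):
    """Toy renderer mimicking the variable semantics described above:
    missing optional variables become empty strings, missing required
    variables raise an error."""
    names = set(re.findall(r"{{\s*(\w+)\s*}}", template))
    missing_required = (set(required or ())) & (names - values.keys())
    if missing_required:
        raise ValueError(f"Missing required variables: {sorted(missing_required)}")
    # Optional variables that were not provided render as empty strings.
    return re.sub(r"{{\s*(\w+)\s*}}", lambda m: str(values.get(m.group(1), "")), template)

print(render("Translate to {{ target_language }}: {{ snippet }}", snippet="Hallo"))
# -> 'Translate to : Hallo'  (target_language defaulted to an empty string)
```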
Usage examples
With static prompt template
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

template = [ChatMessage.from_user("Translate to {{ target_language }}. Context: {{ snippet }}; Translation:")]
builder = ChatPromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")
```
Overriding static template at runtime
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

template = [ChatMessage.from_user("Translate to {{ target_language }}. Context: {{ snippet }}; Translation:")]
builder = ChatPromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")

msg = "Translate to {{ target_language }} and summarize. Context: {{ snippet }}; Summary:"
summary_template = [ChatMessage.from_user(msg)]
builder.run(target_language="spanish", snippet="I can't speak spanish.", template=summary_template)
```
With dynamic prompt template
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack.utils import Secret

# no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()
llm = OpenAIChatGenerator(api_key=Secret.from_token("<your-api-key>"), model="gpt-4o-mini")

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")

location = "Berlin"
language = "English"
system_message = ChatMessage.from_system("You are an assistant giving information to tourists in {{language}}")
messages = [system_message, ChatMessage.from_user("Tell me about {{location}}")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "language": language},
                                        "template": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Berlin is the capital city of Germany and one of the most vibrant
and diverse cities in Europe. Here are some key things to know...Enjoy your time exploring the vibrant and dynamic
capital of Germany!", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-4o-mini',
'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 27, 'completion_tokens': 681, 'total_tokens': 708}})]}}

messages = [system_message, ChatMessage.from_user("What's the weather forecast for {{location}} in the next {{day_count}} days?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "day_count": "5"},
                                        "template": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Here is the weather forecast for Berlin in the next 5
days:\n\nDay 1: Mostly cloudy with a high of 22°C (72°F) and...so it's always a good idea to check for updates
closer to your visit.", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-4o-mini',
'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 37, 'completion_tokens': 201, 'total_tokens': 238}})]}}
```
ChatPromptBuilder.__init__
```python
def __init__(template: Optional[List[ChatMessage]] = None,
             required_variables: Optional[Union[List[str], Literal["*"]]] = None,
             variables: Optional[List[str]] = None)
```
Constructs a ChatPromptBuilder component.
Arguments:
- `template`: A list of `ChatMessage` objects. The component looks for Jinja2 template syntax and renders the prompt with the provided variables. Provide the template either in the `__init__` method or in the `run` method.
- `required_variables`: List of variables that must be provided as input to ChatPromptBuilder. If a variable listed as required is not provided, an exception is raised. If set to `"*"`, all variables found in the prompt are required.
- `variables`: List of input variables to use in prompt templates instead of the ones inferred from the `template` parameter. For example, to use more variables during prompt engineering than the ones present in the default template, you can provide them here.
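When `variables` is not given, the input names are inferred from the template itself. A simplified, regex-based sketch of that inference (the component parses the Jinja2 template properly; `infer_variables` is illustrative only):

```python
import re

def infer_variables(message_texts):
    """Collect {{ placeholder }} names across a list of message strings."""
    names = set()
    for text in message_texts:
        names |= set(re.findall(r"{{\s*(\w+)\s*}}", text))
    return sorted(names)

template_texts = [
    "You are an assistant giving information to tourists in {{language}}",
    "Tell me about {{location}}",
]
print(infer_variables(template_texts))  # -> ['language', 'location']
```

Passing `variables=["language", "location", "day_count"]` explicitly would make all three available as inputs, even though `day_count` does not appear in the default template.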
ChatPromptBuilder.run
```python
@component.output_types(prompt=List[ChatMessage])
def run(template: Optional[List[ChatMessage]] = None,
        template_variables: Optional[Dict[str, Any]] = None,
        **kwargs)
```
Renders the prompt template with the provided variables.
It applies the template variables to render the final prompt. You can provide variables with pipeline kwargs.
To overwrite the default template, set the `template` parameter.
To overwrite pipeline kwargs, set the `template_variables` parameter.
Arguments:
- `template`: An optional list of `ChatMessage` objects to overwrite ChatPromptBuilder's default template. If `None`, the default template provided at initialization is used.
- `template_variables`: An optional dictionary of template variables to overwrite the pipeline variables.
- `kwargs`: Pipeline variables used for rendering the prompt.
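The precedence between pipeline kwargs and `template_variables` amounts to a dictionary merge in which `template_variables` wins. A hypothetical sketch of that resolution (`resolve_variables` is illustrative, not the component's actual code):

```python
def resolve_variables(kwargs, template_variables=None):
    """template_variables override pipeline kwargs of the same name."""
    merged = dict(kwargs)
    merged.update(template_variables or {})
    return merged

print(resolve_variables({"target_language": "spanish", "snippet": "Hi"},
                        template_variables={"target_language": "french"}))
# -> {'target_language': 'french', 'snippet': 'Hi'}
```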
Raises:
- `ValueError`: If `chat_messages` is empty or contains elements that are not instances of `ChatMessage`.
Returns:
A dictionary with the following keys:
- `prompt`: The updated list of `ChatMessage` objects after rendering the templates.
ChatPromptBuilder.to_dict
```python
def to_dict() -> Dict[str, Any]
```
Returns a dictionary representation of the component.
Returns:
Serialized dictionary representation of the component.
ChatPromptBuilder.from_dict
```python
@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "ChatPromptBuilder"
```
Deserialize this component from a dictionary.
Arguments:
- `data`: The dictionary to deserialize and create the component.
Returns:
The deserialized component.
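The general pattern behind `to_dict`/`from_dict` can be shown with a minimal stand-in class (a simplified mimic of the round trip; the `ToyBuilder` class and its dictionary field names are illustrative and not guaranteed to match Haystack's actual serialization format):

```python
class ToyBuilder:
    """Minimal stand-in demonstrating a to_dict/from_dict round trip."""

    def __init__(self, template=None):
        self.template = template

    def to_dict(self):
        # Serialize the component type together with its init parameters.
        return {"type": "ToyBuilder", "init_parameters": {"template": self.template}}

    @classmethod
    def from_dict(cls, data):
        # Rebuild the component from the stored init parameters.
        return cls(**data["init_parameters"])

original = ToyBuilder(template="Translate to {{ target_language }}")
restored = ToyBuilder.from_dict(original.to_dict())
assert restored.template == original.template
```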