ChatPromptBuilder
This component constructs prompts dynamically by processing chat messages.
| | |
| --- | --- |
| Name | `ChatPromptBuilder` |
| Folder path | `/builders/` |
| Most common position in a pipeline | Before a Generator |
| Mandatory input variables | `**kwargs`: Any strings that should be used to render the prompt template |
| Output variables | `prompt`: A dynamically constructed prompt |
Overview
`ChatPromptBuilder` generates prompts dynamically by processing a list of `ChatMessage` instances. It integrates with Jinja2 templating.

It constructs prompts for the pipeline using static or dynamic templates. Users can change the prompt template at runtime by providing a new template for each pipeline run invocation if needed.

`ChatMessage` is a data class that includes message content, a role (who generated the message, such as `user`, `assistant`, `system`, or `function`), and optional metadata.
Variables
The template variables found in the init template are used as input types for the component and are all optional unless explicitly specified. If an optional template variable is not provided as an input, it is replaced with an empty string in the rendered prompt. Use the `variables` and `required_variables` init parameters to specify the input types and required variables.
Usage
On its own
With static template
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

template = [ChatMessage.from_user("Translate to {{ target_language }}. Context: {{ snippet }}; Translation:")]
builder = ChatPromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")
```
Overriding static template at runtime
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

template = [ChatMessage.from_user("Translate to {{ target_language }}. Context: {{ snippet }}; Translation:")]
builder = ChatPromptBuilder(template=template)
builder.run(target_language="spanish", snippet="I can't speak spanish.")

summary_template = [ChatMessage.from_user("Translate to {{ target_language }} and summarize. Context: {{ snippet }}; Summary:")]
builder.run(target_language="spanish", snippet="I can't speak spanish.", template=summary_template)
```
In a pipeline
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

# no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()
llm = OpenAIChatGenerator()

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")

location = "Berlin"
language = "English"
system_message = ChatMessage.from_system("You are an assistant giving information to tourists in {{language}}")
messages = [system_message, ChatMessage.from_user("Tell me about {{location}}")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "language": language},
                                        "template": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Berlin is the capital city of Germany and one of the most vibrant
>> and diverse cities in Europe. Here are some key things to know...Enjoy your time exploring the vibrant and dynamic
>> capital of Germany!", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613',
>> 'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 27, 'completion_tokens': 681, 'total_tokens':
>> 708}})]}}
```
Then, you could ask about the weather forecast for that location. The `ChatPromptBuilder` fills in the template with the new `day_count` variable and passes it to an LLM once again:
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack import Pipeline

# no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()
llm = OpenAIChatGenerator()

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("llm", llm)
pipe.connect("prompt_builder.prompt", "llm.messages")

location = "Berlin"
language = "English"
system_message = ChatMessage.from_system("You are an assistant giving information to tourists in {{language}}")
messages = [system_message, ChatMessage.from_user("What's the weather forecast for {{location}} in the next {{day_count}} days?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location, "language": language, "day_count": "5"},
                                        "template": messages}})
print(res)

>> {'llm': {'replies': [ChatMessage(content="Here is the weather forecast for Berlin in the next 5
>> days:\n\nDay 1: Mostly cloudy with a high of 22°C (72°F) and...so it's always a good idea to check for updates
>> closer to your visit.", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-3.5-turbo-0613',
>> 'index': 0, 'finish_reason': 'stop', 'usage': {'prompt_tokens': 37, 'completion_tokens': 201,
>> 'total_tokens': 238}})]}}
```