

This component enables chat completion using models through Amazon Bedrock service.

Most common position in a pipeline: after DynamicChatPromptBuilder

Mandatory input variables:
"messages": a list of ChatMessage instances

Output variables:
"replies": a list of ChatMessage objects

"meta": a list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on

Amazon Bedrock is a fully managed service that makes high-performing foundation models from leading AI startups and Amazon available through a unified API. You can choose from various foundation models to find the one best suited for your use case.

AmazonBedrockChatGenerator enables chat completion using chat models from Anthropic, Cohere, and Meta Llama 2 with a single component.

The models we currently support are Anthropic's Claude 2 models and Claude 3 Sonnet, and Meta's Llama 2. As more chat models are added to Amazon Bedrock, support for them will be provided through AmazonBedrockChatGenerator.


This component uses AWS for authentication. You can use the AWS CLI to authenticate through your IAM identity. For more information on setting up an IAM identity-based policy, see the official documentation.



Consider using AWS CLI as a more straightforward tool to manage your AWS services. With AWS CLI, you can quickly configure your boto3 credentials. This way, you won't need to provide detailed authentication parameters when initializing Amazon Bedrock Generator in Haystack.

To use this component for chat generation, initialize an AmazonBedrockChatGenerator with the model name. The AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION) should be set as environment variables, configured as described above, or passed as Secret arguments. Make sure the region you set supports Amazon Bedrock.
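Before initializing the generator, it can help to verify that the expected environment variables are present. This is a minimal sketch using only the standard library; the helper name `missing_aws_vars` is ours for illustration and is not part of the integration:

```python
import os

# Variables the component reads when credentials are not passed explicitly.
REQUIRED_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_DEFAULT_REGION")

def missing_aws_vars(env=os.environ):
    """Return the names of required AWS variables that are not set."""
    return [name for name in REQUIRED_VARS if name not in env]

if __name__ == "__main__":
    missing = missing_aws_vars()
    if missing:
        print(f"Set these before using AmazonBedrockChatGenerator: {missing}")
    else:
        print("AWS credentials found in the environment.")
```

If anything is reported missing, either export the variables, run `aws configure`, or pass the credentials as Secret arguments when initializing the component.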

To start using Amazon Bedrock with Haystack, install the amazon-bedrock-haystack package:

pip install amazon-bedrock-haystack


On its own

Basic usage:

from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockChatGenerator
from haystack.dataclasses import ChatMessage

generator = AmazonBedrockChatGenerator(model="meta.llama2-70b-chat-v1")
messages = [ChatMessage.from_system("You are a helpful assistant that answers questions in Spanish only"),
            ChatMessage.from_user("What's Natural Language Processing? Be brief.")]
response = generator.run(messages=messages)
print(response)

# >>> {'replies': [ChatMessage(content='  Procesamiento del Lenguaje Natural (PLN) es una rama de la inteligencia artificial que se enfoca en el desarrollo de algoritmos y modelos computacionales para analizar, comprender y generar texto y lenguaje humano.', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'prompt_token_count': 46, 'generation_token_count': 60, 'stop_reason': 'stop'})]}

In a pipeline

In a RAG pipeline:

from haystack import Pipeline
from haystack.components.builders import DynamicChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockChatGenerator

pipe = Pipeline()
pipe.add_component("prompt_builder", DynamicChatPromptBuilder())
pipe.add_component("llm", AmazonBedrockChatGenerator(model="meta.llama2-70b-chat-v1"))
pipe.connect("prompt_builder", "llm")

country = "Germany"
system_message = ChatMessage.from_system("You are an assistant giving out valuable information to language learners.")
messages = [system_message, ChatMessage.from_user("What's the official language of {{ country }}?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"country": country}, "prompt_source": messages}})
print(res)

# {'llm': {'replies': [ChatMessage(content='  The official language of Germany is German. There are several dialects 
# of German spoken throughout the country, but the standard form of German used in government, education, and media 
# is called "High German" or "Hochdeutsch." This is the variety of German that is taught in schools and used in formal 
# situations. There are also several recognized minority languages in Germany, including Low German, Sorbian, and 
# Frisian.', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'prompt_token_count': 44, 
# 'generation_token_count': 89, 'stop_reason': 'stop'})]}}

Related Links

Check out the API reference in the GitHub repo or in our docs.