AmazonBedrockChatGenerator
This component enables chat completion using models through Amazon Bedrock service.
| | |
| --- | --- |
| Most common position in a pipeline | After a ChatPromptBuilder |
| Mandatory init variables | "model": The model to use. "aws_access_key_id": AWS access key ID. Can be set with the AWS_ACCESS_KEY_ID env var. "aws_secret_access_key": AWS secret access key. Can be set with the AWS_SECRET_ACCESS_KEY env var. "aws_region_name": AWS region name. Can be set with the AWS_DEFAULT_REGION env var. |
| Mandatory run variables | "messages": A list of ChatMessage instances |
| Output variables | "replies": A list of ChatMessage objects. "meta": A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on |
| API reference | Amazon Bedrock |
| GitHub link | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/amazon_bedrock |
Amazon Bedrock is a fully managed service that makes high-performing foundation models from leading AI startups and Amazon available through a unified API. You can choose from various foundation models to find the one best suited for your use case.
AmazonBedrockChatGenerator enables chat completion using chat models from Anthropic, Cohere, Meta Llama 2, and Mistral with a single component. The models currently supported are Anthropic's Claude, Meta's Llama 2, and Mistral's models; as more chat models are added to Bedrock, support for them will be provided through AmazonBedrockChatGenerator.
Overview
This component uses AWS for authentication. You can use the AWS CLI to authenticate through your IAM identity. For more information on setting up an IAM identity-based policy, see the official documentation.
Using AWS CLI
Consider using the AWS CLI as a more straightforward tool to manage your AWS services. With the AWS CLI, you can quickly configure your boto3 credentials. This way, you won't need to provide detailed authentication parameters when initializing AmazonBedrockChatGenerator in Haystack.
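For example, running `aws configure` writes the credentials and region to your local AWS config files, which boto3 then picks up automatically (the values below are placeholders):

```shell
# Interactively store credentials in ~/.aws/credentials and ~/.aws/config.
aws configure
# AWS Access Key ID [None]: <your-access-key-id>
# AWS Secret Access Key [None]: <your-secret-access-key>
# Default region name [None]: us-east-1
# Default output format [None]: json
```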
To use this component for chat generation, initialize AmazonBedrockChatGenerator with the model name and the AWS credentials. The credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION) should be set as environment variables, configured as described above, or passed as Secret arguments. Note: make sure the region you set supports Amazon Bedrock.
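If you set the credentials in code, a minimal sketch looks like this (the environment variable names are the ones listed in the table above; the commented-out constructor call illustrates the Secret-argument route and assumes Haystack's `Secret.from_env_var` helper):

```python
import os

# Set the AWS credentials as environment variables (placeholder values).
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-access-key"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

# The component reads these env vars by default; alternatively, pass them
# explicitly as Secret arguments, e.g.:
# from haystack.utils import Secret
# from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockChatGenerator
# generator = AmazonBedrockChatGenerator(
#     model="anthropic.claude-v2",
#     aws_region_name=Secret.from_env_var("AWS_DEFAULT_REGION"),
# )
print(os.environ["AWS_DEFAULT_REGION"])
```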
To start using Amazon Bedrock with Haystack, install the amazon-bedrock-haystack package:

```shell
pip install amazon-bedrock-haystack
```
Streaming
This Generator supports streaming the tokens from the LLM directly in output. To do so, pass a function to the streaming_callback init parameter.
Usage
On its own
Basic usage:

```python
from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockChatGenerator
from haystack.dataclasses import ChatMessage

generator = AmazonBedrockChatGenerator(model="meta.llama2-70b-chat-v1")

messages = [
    ChatMessage.from_system("You are a helpful assistant that answers questions in Spanish only"),
    ChatMessage.from_user("What's Natural Language Processing? Be brief."),
]

response = generator.run(messages)
print(response)
# >>> {'replies': [ChatMessage(content=' Procesamiento del Lenguaje Natural (PLN) es una rama de la inteligencia
# artificial que se enfoca en el desarrollo de algoritmos y modelos computacionales para analizar, comprender y
# generar texto y lenguaje humano.', role=<ChatRole.ASSISTANT: 'assistant'>, name=None,
# meta={'prompt_token_count': 46, 'generation_token_count': 60, 'stop_reason': 'stop'})]}
```
In a pipeline
In a RAG pipeline:

```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockChatGenerator

pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder())
pipe.add_component("llm", AmazonBedrockChatGenerator(model="meta.llama2-70b-chat-v1"))
pipe.connect("prompt_builder", "llm")

country = "Germany"
system_message = ChatMessage.from_system("You are an assistant giving out valuable information to language learners.")
messages = [system_message, ChatMessage.from_user("What's the official language of {{ country }}?")]

res = pipe.run(data={"prompt_builder": {"template_variables": {"country": country}, "template": messages}})
print(res)
# {'llm': {'replies': [ChatMessage(content=' The official language of Germany is German. There are several dialects
# of German spoken throughout the country, but the standard form of German used in government, education, and media
# is called "High German" or "Hochdeutsch." This is the variety of German that is taught in schools and used in formal
# situations. There are also several recognized minority languages in Germany, including Low German, Sorbian, and
# Frisian.', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'prompt_token_count': 44,
# 'generation_token_count': 89, 'stop_reason': 'stop'})]}}
```