
AmazonBedrockGenerator

This component enables text generation using models available through the Amazon Bedrock service.

Most common position in a pipeline: After a PromptBuilder

Mandatory init variables:
- "model": The model to use
- "aws_access_key_id": AWS access key ID. Can be set with AWS_ACCESS_KEY_ID env var.
- "aws_secret_access_key": AWS secret access key. Can be set with AWS_SECRET_ACCESS_KEY env var.
- "aws_region_name": AWS region name. Can be set with AWS_DEFAULT_REGION env var.

Mandatory run variables:
- "prompt": The instructions for the Generator

Output variables:
- "replies": A list of strings with all the replies generated by the model

API reference: Amazon Bedrock
GitHub link: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/amazon_bedrock

Amazon Bedrock is a fully managed service that makes high-performing foundation models from leading AI startups and Amazon available through a unified API. You can choose from various foundation models to find the one best suited for your use case.

AmazonBedrockGenerator enables text generation using models from AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon with a single component.

The models that we currently support are Anthropic's Claude, AI21 Labs' Jurassic-2, Stability AI's Stable Diffusion, Cohere's Command and Embed, Meta's Llama 2, and the Amazon Titan language and embeddings models.

Overview

This component uses AWS for authentication. You can use the AWS CLI to authenticate with your IAM identity. For more information on setting up an IAM identity-based policy, see the official documentation.

📘

Using AWS CLI

Consider using the AWS CLI as a more straightforward tool to manage your AWS services. With the AWS CLI, you can quickly configure your boto3 credentials. This way, you won't need to provide detailed authentication parameters when initializing the AmazonBedrockGenerator in Haystack.

To use this component for text generation, initialize an AmazonBedrockGenerator with the model name. The AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION) should be set as environment variables, configured as described above, or passed as Secret arguments. Make sure the region you set supports Amazon Bedrock.

To start using Amazon Bedrock with Haystack, install the amazon-bedrock-haystack package:

pip install amazon-bedrock-haystack
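If you prefer not to rely on environment variables, the credentials can be passed explicitly as Secret arguments at initialization. Here is a minimal sketch, assuming the credential parameters accept Haystack Secret values; the placeholder values are for illustration only:

from haystack.utils import Secret

from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockGenerator

# Placeholder values for illustration only; replace them with your own
# credentials and a region that supports Amazon Bedrock.
generator = AmazonBedrockGenerator(
    model="anthropic.claude-v2",
    aws_access_key_id=Secret.from_token("..."),
    aws_secret_access_key=Secret.from_token("..."),
    aws_region_name=Secret.from_token("eu-central-1"),
)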

Streaming

This Generator supports streaming the tokens from the LLM directly in the output. To do so, pass a function to the streaming_callback init parameter, as shown in the sketch below.
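For example, here is a minimal sketch that uses Haystack's built-in print_streaming_chunk utility as the callback to write each streamed chunk to standard output:

from haystack.components.generators.utils import print_streaming_chunk

from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockGenerator

# Each chunk is printed as soon as the model produces it
generator = AmazonBedrockGenerator(
    model="anthropic.claude-v2",
    streaming_callback=print_streaming_chunk,
)
generator.run("Explain what Amazon Bedrock is in one sentence.")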

Usage

On its own

Basic usage:

from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockGenerator

# AWS credentials are read from the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY,
# and AWS_DEFAULT_REGION environment variables or from your AWS CLI configuration.
generator = AmazonBedrockGenerator(model="anthropic.claude-v2")
result = generator.run("Who is the best American actor?")
for reply in result["replies"]:
    print(reply)

# >>> 'There is no definitive "best" American actor, as acting skill and talent are subjective. However, some of the most acclaimed and influential American actors include Tom Hanks, Daniel Day-Lewis, Denzel Washington, Meryl Streep, Robert De Niro, Al Pacino, Marlon Brando, Jack Nicholson, Leonardo DiCaprio and Johnny Depp. Choosing a single "best" actor comes down to personal preference.'

In a pipeline

In a RAG pipeline:

from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.builders import PromptBuilder
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack import Document, Pipeline

from haystack_integrations.components.generators.amazon_bedrock import AmazonBedrockGenerator

template = """
Given the following information, answer the question.

Context: 
{% for document in documents %}
    {{ document.content }}
{% endfor %}

Question: What's the official language of {{ country }}?
"""

# AWS credentials are read from environment variables or your AWS CLI configuration.
generator = AmazonBedrockGenerator(model="anthropic.claude-v2")

docstore = InMemoryDocumentStore()
docstore.write_documents([Document(content="The official language of France is French.")])

pipe = Pipeline()
pipe.add_component("retriever", InMemoryBM25Retriever(document_store=docstore))
pipe.add_component("prompt_builder", PromptBuilder(template=template))
pipe.add_component("generator", generator)
pipe.connect("retriever", "prompt_builder.documents")
pipe.connect("prompt_builder", "generator")

pipe.run({
    "retriever": {"query": "France"},
    "prompt_builder": {"country": "France"},
})

# {'generator': {'replies': ['Based on the context provided, the official language of France is French.']}}

Additional References

🧑‍🍳 Cookbook: PDF-Based Question Answering with Amazon Bedrock and Haystack