
Anthropic integration for Haystack

Module haystack_integrations.components.generators.anthropic.generator

AnthropicGenerator

@component
class AnthropicGenerator()

Enables text generation using Anthropic large language models (LLMs). It supports the Claude family of models.

Although Anthropic natively supports a much richer messaging API, we have intentionally simplified it in this component so that the main input/output interface is string-based. For more complete support, consider using the AnthropicChatGenerator.

from haystack_integrations.components.generators.anthropic import AnthropicGenerator

client = AnthropicGenerator(model="claude-2.1")
response = client.run("What's Natural Language Processing? Be brief.")
print(response)
>> {'replies': ['Natural language processing (NLP) is a branch of artificial intelligence focused on enabling
>>computers to understand, interpret, and manipulate human language. The goal of NLP is to read, decipher,
>> understand, and make sense of the human languages in a manner that is valuable.'], 'meta': {'model':
>> 'claude-2.1', 'index': 0, 'finish_reason': 'end_turn', 'usage': {'input_tokens': 18, 'output_tokens': 58}}}

AnthropicGenerator.__init__

def __init__(api_key: Secret = Secret.from_env_var("ANTHROPIC_API_KEY"),
             model: str = "claude-3-sonnet-20240229",
             streaming_callback: Optional[Callable[[StreamingChunk],
                                                   None]] = None,
             system_prompt: Optional[str] = None,
             generation_kwargs: Optional[Dict[str, Any]] = None)

Initialize the AnthropicGenerator.

Arguments:

  • api_key: The Anthropic API key. Read from the ANTHROPIC_API_KEY environment variable by default.
  • model: The name of the Anthropic model to use.
  • streaming_callback: An optional callback function to handle streaming chunks.
  • system_prompt: An optional system prompt to use for generation.
  • generation_kwargs: Additional keyword arguments for generation.
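The api_key default resolves through Haystack's Secret.from_env_var. A minimal sketch of how such an env-var-backed secret resolves (EnvVarSecret is an illustrative stand-in, not the actual Secret implementation):

```python
import os

class EnvVarSecret:
    """Simplified stand-in for a secret backed by an environment variable."""

    def __init__(self, env_var: str):
        self.env_var = env_var

    def resolve_value(self) -> str:
        # Fail loudly if the variable is missing, rather than sending an empty key.
        value = os.environ.get(self.env_var)
        if value is None:
            raise ValueError(f"Environment variable {self.env_var} is not set")
        return value

os.environ["ANTHROPIC_API_KEY"] = "sk-test-placeholder"  # for illustration only
secret = EnvVarSecret("ANTHROPIC_API_KEY")
print(secret.resolve_value())  # sk-test-placeholder
```

Resolving at call time (rather than storing the raw key on the component) keeps the secret out of serialized configuration, which matters for to_dict below.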

AnthropicGenerator.to_dict

def to_dict() -> Dict[str, Any]

Serialize this component to a dictionary.

Returns:

The serialized component as a dictionary.

AnthropicGenerator.from_dict

@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "AnthropicGenerator"

Deserialize this component from a dictionary.

Arguments:

  • data: The dictionary representation of this component.

Returns:

The deserialized component instance.
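Together, to_dict and from_dict form a serialization round trip: from_dict(to_dict()) should reconstruct an equivalent component. A toy sketch of that contract (ToyGenerator is a hypothetical stand-in, not the real component):

```python
from typing import Any, Dict, Optional

class ToyGenerator:
    """Toy component illustrating the to_dict/from_dict round-trip contract."""

    def __init__(self, model: str, generation_kwargs: Optional[Dict[str, Any]] = None):
        self.model = model
        self.generation_kwargs = generation_kwargs or {}

    def to_dict(self) -> Dict[str, Any]:
        # Serialize only the init parameters needed to rebuild the component.
        return {
            "type": "ToyGenerator",
            "init_parameters": {
                "model": self.model,
                "generation_kwargs": self.generation_kwargs,
            },
        }

    @classmethod
    def from_dict(cls, data: Dict[str, Any]) -> "ToyGenerator":
        return cls(**data["init_parameters"])

original = ToyGenerator(model="claude-3-sonnet-20240229",
                        generation_kwargs={"max_tokens": 512})
restored = ToyGenerator.from_dict(original.to_dict())
assert restored.model == original.model
assert restored.generation_kwargs == original.generation_kwargs
```

This is the pattern Haystack pipelines rely on when saving and loading pipeline definitions.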

AnthropicGenerator.run

@component.output_types(replies=List[str], meta=List[Dict[str, Any]])
def run(prompt: str, generation_kwargs: Optional[Dict[str, Any]] = None)

Generate replies using the Anthropic API.

Arguments:

  • prompt: The input prompt for generation.
  • generation_kwargs: Additional keyword arguments for generation.

Returns:

A dictionary containing:

  • replies: A list of generated replies.
  • meta: A list of metadata dictionaries for each reply.
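When generation_kwargs is passed to run, its values take precedence over those given in __init__. A sketch of that merge behavior, assuming a plain dict merge (merge_generation_kwargs is an illustrative helper, not part of the library):

```python
from typing import Any, Dict, Optional

def merge_generation_kwargs(init_kwargs: Dict[str, Any],
                            runtime_kwargs: Optional[Dict[str, Any]]) -> Dict[str, Any]:
    # Later entries win in a dict merge, so runtime values override init values.
    return {**init_kwargs, **(runtime_kwargs or {})}

merged = merge_generation_kwargs(
    {"max_tokens": 512, "temperature": 0.7},  # set once in __init__
    {"temperature": 0.0},                     # overridden per call to run
)
print(merged)  # {'max_tokens': 512, 'temperature': 0.0}
```

This lets you fix stable defaults at construction time and vary only the parameters that change per request.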

Module haystack_integrations.components.generators.anthropic.chat.chat_generator

AnthropicChatGenerator

@component
class AnthropicChatGenerator()

Enables text generation using Anthropic's state-of-the-art Claude 3 family of large language models (LLMs) through the Anthropic messaging API.

It supports models like claude-3-opus, claude-3-sonnet, and claude-3-haiku, accessed through the /v1/messages API endpoint using the Claude v2.1 messaging version.

Users can pass any text generation parameters valid for the Anthropic messaging API directly to this component via the generation_kwargs parameter in __init__ or the generation_kwargs parameter in the run method.

For more details on the parameters supported by the Anthropic API, refer to the Anthropic Message API documentation.

For more details on supported models and their capabilities, refer to the Anthropic documentation.

Note: Vision capabilities are not yet supported in the current implementation.

from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator
from haystack.dataclasses import ChatMessage

messages = [ChatMessage.from_user("What's Natural Language Processing?")]
client = AnthropicChatGenerator(model="claude-3-sonnet-20240229")
response = client.run(messages)
print(response)

>> {'replies': [ChatMessage(content='Natural Language Processing (NLP) is a field of artificial intelligence that
>> focuses on enabling computers to understand, interpret, and generate human language. It involves developing
>> techniques and algorithms to analyze and process text or speech data, allowing machines to comprehend and
>> communicate in natural languages like English, Spanish, or Chinese.', role=<ChatRole.ASSISTANT: 'assistant'>,
>> name=None, meta={'model': 'claude-3-sonnet-20240229', 'index': 0, 'finish_reason': 'end_turn',
>> 'usage': {'input_tokens': 15, 'output_tokens': 64}})]}

AnthropicChatGenerator.__init__

def __init__(api_key: Secret = Secret.from_env_var("ANTHROPIC_API_KEY"),
             model: str = "claude-3-sonnet-20240229",
             streaming_callback: Optional[Callable[[StreamingChunk],
                                                   None]] = None,
             generation_kwargs: Optional[Dict[str, Any]] = None)

Creates an instance of AnthropicChatGenerator.

Arguments:

  • api_key: The Anthropic API key. Read from the ANTHROPIC_API_KEY environment variable by default.
  • model: The name of the model to use.
  • streaming_callback: A callback function that is called when a new token is received from the stream. The callback function accepts StreamingChunk as an argument.
  • generation_kwargs: Other parameters to use for the model. These parameters are all sent directly to the Anthropic endpoint. See Anthropic documentation for more details.

Supported generation_kwargs parameters are:

  • system: The system message to be passed to the model.
  • max_tokens: The maximum number of tokens to generate.
  • metadata: A dictionary of metadata to be passed to the model.
  • stop_sequences: A list of strings; the model stops generating as soon as it produces any of them.
  • temperature: The temperature to use for sampling.
  • top_p: The top_p value to use for nucleus sampling.
  • top_k: The top_k value to use for top-k sampling.
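A minimal sketch of assembling a generation_kwargs dict and sanity-checking it against the supported keys listed above (the validation step is illustrative, not something the library performs for you):

```python
# Keys accepted in generation_kwargs, per the list above.
SUPPORTED_KEYS = {"system", "max_tokens", "metadata", "stop_sequences",
                  "temperature", "top_p", "top_k"}

generation_kwargs = {
    "system": "You are a concise assistant.",
    "max_tokens": 512,
    "stop_sequences": ["\n\nHuman:"],
    "temperature": 0.2,
}

# Catch typos (e.g. "temprature") before the request reaches the API.
unknown = set(generation_kwargs) - SUPPORTED_KEYS
assert not unknown, f"Unsupported generation_kwargs: {unknown}"
```

The resulting dict can be passed either to __init__ (as defaults) or to run (per-call overrides).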

AnthropicChatGenerator.to_dict

def to_dict() -> Dict[str, Any]

Serialize this component to a dictionary.

Returns:

The serialized component as a dictionary.

AnthropicChatGenerator.from_dict

@classmethod
def from_dict(cls, data: Dict[str, Any]) -> "AnthropicChatGenerator"

Deserialize this component from a dictionary.

Arguments:

  • data: The dictionary representation of this component.

Returns:

The deserialized component instance.

AnthropicChatGenerator.run

@component.output_types(replies=List[ChatMessage])
def run(messages: List[ChatMessage],
        generation_kwargs: Optional[Dict[str, Any]] = None)

Invokes text generation based on the provided messages and generation parameters.

Arguments:

  • messages: A list of ChatMessage instances representing the input messages.
  • generation_kwargs: Additional keyword arguments for text generation. Any keys set here override the same keys passed in the __init__ method. For more details on the parameters supported by the Anthropic API, refer to the Anthropic documentation.

Returns:

  • replies: A list of ChatMessage instances representing the generated responses.
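Both generators accept a streaming_callback that is invoked once per streamed chunk. A sketch of a callback that accumulates streamed text (the StreamingChunk dataclass below is a simplified stand-in for haystack.dataclasses.StreamingChunk, whose real definition may carry more fields):

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class StreamingChunk:
    """Simplified stand-in: one piece of streamed model output."""
    content: str
    meta: Dict[str, Any] = field(default_factory=dict)

collected: List[str] = []

def on_chunk(chunk: StreamingChunk) -> None:
    # In real use this might print chunk.content as it arrives;
    # here we just accumulate the pieces.
    collected.append(chunk.content)

# Simulate the generator streaming three chunks to the callback.
for piece in ["Natural ", "Language ", "Processing"]:
    on_chunk(StreamingChunk(content=piece))

print("".join(collected))  # Natural Language Processing
```

In practice you would pass the callback at construction time, e.g. AnthropicChatGenerator(streaming_callback=on_chunk), and the final reply in the run output still contains the full concatenated text.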