
CohereChatGenerator

CohereChatGenerator enables chat completions using Cohere's large language models (LLMs).

Most common position in a pipeline: After a ChatPromptBuilder
Mandatory init variables: "api_key": The Cohere API key. Can be set with the COHERE_API_KEY or CO_API_KEY environment variable.
Mandatory run variables: "messages": A list of ChatMessage objects
Output variables: "replies": A list of ChatMessage objects; "meta": A list of dictionaries with the metadata associated with each reply, such as token count, finish reason, and so on
API reference: Cohere
GitHub link: https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/cohere

This integration supports Cohere chat models such as command, command-r, and command-r-plus. Check out the most recent full list in the Cohere documentation.

Overview

CohereChatGenerator needs a Cohere API key to work. You can set this key in:

  • The api_key init parameter using the Secret API, as shown in the sketch after this list
  • The COHERE_API_KEY environment variable (recommended)
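
For example, here is a minimal sketch of passing the key explicitly with the Secret API; reading it from an environment variable keeps the key out of your code:

from haystack.utils import Secret
from haystack_integrations.components.generators.cohere import CohereChatGenerator

# Read the key from an environment variable (recommended).
# Secret.from_token("...") also works, but hard-codes the key in your script.
generator = CohereChatGenerator(api_key=Secret.from_env_var("COHERE_API_KEY"))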

Then, the component needs a prompt to operate. You can pass any text generation parameters valid for the co.chat method directly to this component through the generation_kwargs parameter, both at initialization and in the run() method. For more details on the parameters supported by the Cohere API, refer to the Cohere documentation.
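
For instance, here is a minimal sketch of setting generation parameters in both places; temperature and max_tokens are examples of parameters the Cohere Chat API accepts:

from haystack_integrations.components.generators.cohere import CohereChatGenerator
from haystack.dataclasses import ChatMessage

# Defaults set at initialization apply to every call
generator = CohereChatGenerator(generation_kwargs={"temperature": 0.3})

message = ChatMessage.from_user("What's Natural Language Processing? Be brief.")
# Parameters passed to run() apply to this call and take precedence over the defaults
result = generator.run([message], generation_kwargs={"max_tokens": 100})
print(result["replies"])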

Finally, the component needs a list of ChatMessage objects to operate. ChatMessage is a data class that contains a message, a role (who generated the message, such as user, assistant, system, function), and optional metadata.
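
As a small illustration (the message texts are placeholders), you can build a short conversation with different roles:

from haystack.dataclasses import ChatMessage

# Each ChatMessage carries a role along with the message text
messages = [
    ChatMessage.from_system("You are a concise assistant."),
    ChatMessage.from_user("What's Natural Language Processing?"),
]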

Streaming

This Generator supports streaming the tokens from the LLM directly in output. To do so, pass a function to the streaming_callback init parameter.
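
Here is a minimal sketch of a streaming callback that prints tokens as they arrive; the callback receives a StreamingChunk, and this example simply echoes its content:

from haystack_integrations.components.generators.cohere import CohereChatGenerator
from haystack.dataclasses import ChatMessage

# Called once for every streamed chunk; print the token without a trailing newline
def print_chunk(chunk):
    print(chunk.content, end="", flush=True)

generator = CohereChatGenerator(streaming_callback=print_chunk)
generator.run([ChatMessage.from_user("What's Natural Language Processing? Be brief.")])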

Usage

You need to install the cohere-haystack package to use the CohereChatGenerator:

pip install cohere-haystack

On its own

from haystack_integrations.components.generators.cohere import CohereChatGenerator
from haystack.dataclasses import ChatMessage

# The API key is read from the COHERE_API_KEY (or CO_API_KEY) environment variable
generator = CohereChatGenerator()
message = ChatMessage.from_user("What's Natural Language Processing? Be brief.")
print(generator.run([message]))

In a Pipeline

You can also use CohereChatGenerator with Cohere chat models in your pipeline.

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.cohere import CohereChatGenerator

# Render the chat prompt, then send it to the Cohere model
pipe = Pipeline()
pipe.add_component("prompt_builder", ChatPromptBuilder())
pipe.add_component("llm", CohereChatGenerator())
pipe.connect("prompt_builder", "llm")

country = "Germany"
system_message = ChatMessage.from_system("You are an assistant giving out valuable information to language learners.")
messages = [system_message, ChatMessage.from_user("What's the official language of {{ country }}?")]

# template_variables fills the {{ country }} placeholder in the user message
res = pipe.run(data={"prompt_builder": {"template_variables": {"country": country}, "template": messages}})
print(res)


Related Links

Check out the API reference in the GitHub repo or in our docs.