VertexAIGeminiChatGenerator

VertexAIGeminiChatGenerator enables chat completion using Google Gemini models.
| | |
| --- | --- |
| Most common position in a pipeline | After a ChatPromptBuilder |
| Mandatory run variables | "messages": A list of ChatMessage objects representing the chat |
| Output variables | "replies": A list of alternative replies of the model to the input chat |
| API reference | Google Vertex |
| GitHub link | https://github.com/deepset-ai/haystack-core-integrations/tree/main/integrations/google_vertex |
VertexAIGeminiChatGenerator supports both gemini-pro and gemini-1.5-flash models. For available models, see https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models.
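You can pick the model with the model init parameter. A minimal sketch (the model name below is just an example; use any Gemini model enabled for your Google Cloud project):

```python
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

# "gemini-1.5-flash" is an example model name; any Gemini model
# available to your Google Cloud project should work here
gemini_chat = VertexAIGeminiChatGenerator(model="gemini-1.5-flash")
```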
To explore the full capabilities of Gemini, check out this article and the related 🧑‍🍳 Cookbook.
Parameters Overview
VertexAIGeminiChatGenerator uses Google Cloud Application Default Credentials (ADCs) for authentication. For more information on how to set up ADCs, see the official documentation.

Keep in mind that it's essential to use an account that has access to a project authorized to use Google Vertex AI endpoints.

You can find your project ID in the GCP resource manager or locally by running gcloud projects list in your terminal. For more information on the gcloud CLI, see its official documentation.
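For example, on a local machine you could set up ADCs and look up your project ID like this (a sketch assuming the gcloud CLI is installed and your account has access to the project):

```shell
# One-time local setup: store Application Default Credentials
gcloud auth application-default login

# List the projects your account can access to find your project ID
gcloud projects list
```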
Streaming
This Generator supports streaming the tokens from the LLM directly in output. To do so, pass a function to the streaming_callback init parameter.
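As a minimal sketch, the callback below simply prints each chunk as it arrives (print_chunk is an illustrative helper, not part of the integration):

```python
from haystack.dataclasses import StreamingChunk
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

# Illustrative callback: print each streamed chunk as soon as it arrives
def print_chunk(chunk: StreamingChunk):
    print(chunk.content, end="", flush=True)

gemini_chat = VertexAIGeminiChatGenerator(streaming_callback=print_chunk)
```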
Usage
You need to install the google-vertex-haystack package to use the VertexAIGeminiChatGenerator:

```shell
pip install google-vertex-haystack
```
On its own
Basic usage:
```python
from haystack.dataclasses import ChatMessage
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

gemini_chat = VertexAIGeminiChatGenerator()

messages = [ChatMessage.from_user("Tell me the name of a movie")]
res = gemini_chat.run(messages)

print(res["replies"][0].content)
>>> The Shawshank Redemption

# Keep the conversation going by appending the reply and a follow-up question
messages += [res["replies"][0], ChatMessage.from_user("Who's the main actor?")]
res = gemini_chat.run(messages)

print(res["replies"][0].content)
>>> Tim Robbins
```
When chatting with Gemini Pro, you can also easily use function calls. We first need to declare the function's schema for the model:
```python
from vertexai.preview.generative_models import Tool, FunctionDeclaration

get_current_weather_func = FunctionDeclaration(
    name="get_current_weather",
    description="Get the current weather in a given location",
    parameters={
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"},
            "unit": {
                "type": "string",
                "enum": [
                    "celsius",
                    "fahrenheit",
                ],
            },
        },
        "required": ["location"],
    },
)

tool = Tool([get_current_weather_func])
```
For demonstration purposes, we're also creating a function that will always tell us the same thing:

```python
def get_current_weather(location: str, unit: str = "celsius"):
    return {"weather": "sunny", "temperature": 21.8, "unit": unit}
```
We create a new instance of VertexAIGeminiChatGenerator to set our tools:

```python
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

gemini_chat = VertexAIGeminiChatGenerator(tools=[tool])
```
And then ask our question:

```python
from haystack.dataclasses import ChatMessage

messages = [ChatMessage.from_user("What is the temperature in celsius in Berlin?")]
res = gemini_chat.run(messages=messages)

print(res["replies"][0].content)
>>> {'location': 'Berlin, Germany', 'unit': 'celsius'}

# Call our local function with the arguments the model returned
weather = get_current_weather(**res["replies"][0].content)

# Pass the function result back so the model can phrase the final answer
messages += res["replies"] + [ChatMessage.from_function(content=weather, name="get_current_weather")]
res = gemini_chat.run(messages=messages)

print(res["replies"][0].content)
>>> 21.8 Celsius
```
In a pipeline
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage
from haystack import Pipeline
from haystack_integrations.components.generators.google_vertex import VertexAIGeminiChatGenerator

# no parameter init, we don't use any runtime template variables
prompt_builder = ChatPromptBuilder()
gemini_chat = VertexAIGeminiChatGenerator()

pipe = Pipeline()
pipe.add_component("prompt_builder", prompt_builder)
pipe.add_component("gemini", gemini_chat)
pipe.connect("prompt_builder.prompt", "gemini.messages")

location = "Rome"
messages = [ChatMessage.from_user("Tell me briefly about {{location}} history")]
res = pipe.run(data={"prompt_builder": {"template_variables": {"location": location}, "template": messages}})
print(res)

>>> - **753 B.C.:** Traditional date of the founding of Rome by Romulus and Remus.
>>> - **509 B.C.:** Establishment of the Roman Republic, replacing the Etruscan monarchy.
>>> - **492-264 B.C.:** Series of wars against neighboring tribes, resulting in the expansion of the Roman Republic's territory.
>>> - **264-146 B.C.:** Three Punic Wars against Carthage, resulting in the destruction of Carthage and the Roman Republic becoming the dominant power in the Mediterranean.
>>> - **133-73 B.C.:** Series of civil wars and slave revolts, leading to the rise of Julius Caesar.
>>> - **49 B.C.:** Julius Caesar crosses the Rubicon River, starting the Roman Civil War.
>>> - **44 B.C.:** Julius Caesar is assassinated, leading to the Second Triumvirate of Octavian, Mark Antony, and Lepidus.
>>> - **31 B.C.:** Battle of Actium, where Octavian defeats Mark Antony and Cleopatra, becoming the sole ruler of Rome.
>>> - **27 B.C.:** The Roman Republic is transformed into the Roman Empire, with Octavian becoming the first Roman emperor, known as Augustus.
>>> - **1st century A.D.:** The Roman Empire reaches its greatest extent, stretching from Britain to Egypt.
>>> - **3rd century A.D.:** The Roman Empire begins to decline, facing internal instability, invasions by Germanic tribes, and the rise of Christianity.
>>> - **476 A.D.:** The last Western Roman emperor, Romulus Augustulus, is overthrown by the Germanic leader Odoacer, marking the end of the Roman Empire in the West.
```
Additional References
🧑‍🍳 Cookbook: Function Calling and Multimodal QA with Gemini