
Get Started

This page shows how to quickly get up and running with Haystack. It covers installing Haystack, building your first RAG pipeline, and creating a tool-calling Agent.

Build your first RAG application

Let's build your first Retrieval-Augmented Generation (RAG) pipeline and see how Haystack answers questions.

First, install the minimal form of Haystack:

```shell
pip install haystack-ai
```

The example below shows how to pass an API key using a Haystack Secret. This guide uses OpenAI, but other LLM providers follow the same pattern with their own generator components. The simplest approach is to set the API key as an environment variable and read it with Secret.from_env_var.
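If you would rather set the variable from inside a Python session (for example, in a notebook), a minimal sketch using only the standard library is:

```python
import os
from getpass import getpass

# Prompt for the key only if it is not already set; the name OPENAI_API_KEY
# matches what Secret.from_env_var reads in the example below.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("Enter your OpenAI API key: ")
```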

OpenAIChatGenerator is included in the haystack-ai package.

```python
from haystack import Pipeline, Document
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.retrievers import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.builders import ChatPromptBuilder
from haystack.utils import Secret
from haystack.dataclasses import ChatMessage

document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="My name is Jean and I live in Paris."),
    Document(content="My name is Mark and I live in Berlin."),
    Document(content="My name is Giorgio and I live in Rome.")
])

prompt_template = [
    ChatMessage.from_system(
        """
        Given these documents, answer the question.
        Documents:
        {% for doc in documents %}
        {{ doc.content }}
        {% endfor %}
        """
    ),
    ChatMessage.from_user("{{question}}")
]

retriever = InMemoryBM25Retriever(document_store=document_store)
prompt_builder = ChatPromptBuilder(template=prompt_template, required_variables="*")
llm = OpenAIChatGenerator(
    api_key=Secret.from_env_var("OPENAI_API_KEY"),
    model="gpt-4o-mini"
)

rag_pipeline = Pipeline()
rag_pipeline.add_component("retriever", retriever)
rag_pipeline.add_component("prompt_builder", prompt_builder)
rag_pipeline.add_component("llm", llm)
rag_pipeline.connect("retriever", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder", "llm")

question = "Who lives in Paris?"
results = rag_pipeline.run(
    {
        "retriever": {"query": question},
        "prompt_builder": {"question": question},
    }
)

print(results["llm"]["replies"])
```
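The replies field contains the generated ChatMessage objects. If you only want the answer text of the first reply, a small sketch (assuming at least one reply is returned) is:

```python
# Each reply is a ChatMessage; .text holds the plain answer string.
print(results["llm"]["replies"][0].text)
```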

Next Steps

Ready to dive deeper? Check out the Creating Your First QA Pipeline with Retrieval-Augmentation tutorial for a step-by-step guide on building a complete RAG pipeline with your own data.

Build your first Agent

Agents are AI systems that can use tools to gather information, perform actions, and interact with external systems. Let's build an agent that can search the web to answer questions.

This example requires a SerperDev API key for web search. Set it as the SERPERDEV_API_KEY environment variable.
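As with the OpenAI key, you can set this variable from inside a Python session; a minimal sketch using only the standard library:

```python
import os
from getpass import getpass

# SerperDevWebSearch reads SERPERDEV_API_KEY from the environment by default.
if "SERPERDEV_API_KEY" not in os.environ:
    os.environ["SERPERDEV_API_KEY"] = getpass("Enter your SerperDev API key: ")
```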

OpenAIChatGenerator is included in the haystack-ai package.

```python
from haystack.components.agents import Agent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.tools import ComponentTool
from haystack.components.websearch import SerperDevWebSearch
from haystack.utils import Secret

search_tool = ComponentTool(component=SerperDevWebSearch())

agent = Agent(
    chat_generator=OpenAIChatGenerator(
        api_key=Secret.from_env_var("OPENAI_API_KEY"),
        model="gpt-4o-mini"
    ),
    tools=[search_tool],
    system_prompt="You are a helpful assistant that can search the web for information."
)

result = agent.run(messages=[ChatMessage.from_user("What is Haystack AI?")])

print(result["last_message"].text)
```
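Besides last_message, the result also includes the full conversation, which is handy for inspecting the tool calls the agent made. A short sketch, assuming the full message list is exposed under the messages key as in recent Haystack releases:

```python
# Walk through the whole exchange, including tool calls and tool results.
for message in result.get("messages", []):
    print(message.role, "->", (message.text or "")[:200])
```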

Next Steps

For a hands-on guide on creating a tool-calling agent that can use both components and pipelines as tools, check out the Build a Tool-Calling Agent tutorial.