Welcome to Haystack 2.0
To skip the introduction and go straight to installing and creating a search app, see Get Started.
Haystack is an end-to-end framework that you can use to build powerful and production-ready Pipelines with Large Language Models (LLMs) for different search use cases. Whether you want to perform retrieval-augmented generation (RAG), question answering, or semantic document search, you can use the state-of-the-art LLMs and NLP models in Haystack to provide custom search experiences and make it possible for your users to query in natural language.
Haystack is built in a modular fashion so that you can combine the best technology from providers such as OpenAI, Chroma, and Marqo with open source projects like Hugging Face's Transformers or Elasticsearch.
Since Haystack 1.15, we've been gradually introducing new components and features in preparation for Haystack 2.0, a major redesign of Haystack's components, Document Stores, and Pipelines. We believe the Pipeline concept is a fundamental requirement and an optimal fit for building applications with LLMs, so Pipelines and Components will continue to be the foundation of Haystack 2.0. However, the general Pipeline structure, the Components API, and the connection between Document Stores and Retrievers will change, which makes this a breaking change for Haystack 1.x users.
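To make the Pipeline-of-Components idea concrete, here is a minimal plain-Python sketch of the pattern: independent components, each with a `run` method, chained so that one component's output feeds the next. This is an illustration only, not Haystack's actual API; the class and method names here are hypothetical.

```python
# A toy retrieval-augmented generation flow built from two components.
# Each component exposes a run() method; connecting them means passing
# one component's output as the next component's input.

class Retriever:
    """Returns documents matching the query (toy keyword match)."""
    def __init__(self, documents):
        self.documents = documents

    def run(self, query):
        return [d for d in self.documents if query.lower() in d.lower()]


class PromptBuilder:
    """Assembles the query and retrieved documents into an LLM prompt."""
    def run(self, query, documents):
        context = "\n".join(documents)
        return f"Context:\n{context}\n\nQuestion: {query}"


docs = [
    "Haystack builds LLM pipelines.",
    "Elasticsearch stores documents.",
]
retriever = Retriever(docs)
builder = PromptBuilder()

# Chain the components: retrieve, then build the prompt for an LLM.
hits = retriever.run("haystack")
prompt = builder.run("haystack", hits)
```

In Haystack itself, the framework (rather than manual calls) wires components together and routes data between them, which is what lets you swap in different Retrievers, Document Stores, or LLM providers without rewriting the rest of the pipeline.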
Haystack 2.0 is still a work in progress
We are defining the requirements for a more powerful and robust LLM framework with continuous feedback from the community, and we're implementing the new Haystack API so that it keeps pace with advances in NLP.