# SuperComponents

`SuperComponent` lets you wrap a complete pipeline and use it like a single component. This is helpful when you want to simplify the interface of a complex pipeline, reuse it in different contexts, or expose only the necessary inputs and outputs.
## How It Works
### Input Mapping

Map the `SuperComponent`'s input names to the actual sockets inside the pipeline.

```python
input_mapping = {
    "query": ["retriever.query", "prompt.query"]
}
```
### Output Mapping

Map the pipeline's output sockets that you want to expose to the `SuperComponent`'s output names.

```python
output_mapping = {
    "llm.replies": "replies"
}
```
If you don't provide mappings, `SuperComponent` will try to auto-detect them. If multiple components have outputs with the same name, we recommend setting `output_mapping` explicitly to avoid conflicts.
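Conceptually, the two mappings are just renames applied at the pipeline boundary: input mapping fans one `SuperComponent` input out to several component sockets, and output mapping picks and renames the socket outputs to expose. The plain-Python sketch below illustrates that idea only; the helper functions and dictionaries are ours, not Haystack APIs:

```python
def expand_inputs(inputs, input_mapping):
    """Fan each SuperComponent input out to its mapped sockets,
    e.g. {"query": "..."} -> {"retriever": {"query": "..."}, "prompt": {"query": "..."}}."""
    pipeline_inputs = {}
    for name, value in inputs.items():
        for socket in input_mapping[name]:
            component, socket_name = socket.split(".")
            pipeline_inputs.setdefault(component, {})[socket_name] = value
    return pipeline_inputs


def rename_outputs(pipeline_outputs, output_mapping):
    """Keep only the mapped sockets and expose them under new names,
    e.g. {"llm": {"replies": [...]}} -> {"replies": [...]}."""
    outputs = {}
    for socket, exposed_name in output_mapping.items():
        component, socket_name = socket.split(".")
        outputs[exposed_name] = pipeline_outputs[component][socket_name]
    return outputs


# One "query" input reaches both the retriever and the prompt builder:
expanded = expand_inputs(
    {"query": "What is the capital of France?"},
    {"query": ["retriever.query", "prompt.query"]},
)
print(expanded)
```

This is why explicit mappings resolve name conflicts: each exposed name points at exactly one `component.socket` pair.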
## Example

Here is a simple example of initializing a `SuperComponent` with a pipeline:
```python
from haystack import Pipeline, SuperComponent

with open("pipeline.yaml", "r") as file:
    pipeline = Pipeline.load(file)

super_component = SuperComponent(pipeline)
```
The example pipeline below retrieves relevant documents based on a user query, builds a custom prompt using those documents, then sends the prompt to an `OpenAIChatGenerator` to create an answer. The `SuperComponent` wraps the pipeline so it can be run with a simple input (`query`) and returns a clean output (`replies`).
```python
from haystack import Pipeline, SuperComponent
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.builders import ChatPromptBuilder
from haystack.components.retrievers import InMemoryBM25Retriever
from haystack.dataclasses.chat_message import ChatMessage
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.dataclasses import Document

document_store = InMemoryDocumentStore()
documents = [
    Document(content="Paris is the capital of France."),
    Document(content="London is the capital of England."),
]
document_store.write_documents(documents)

prompt_template = [
    ChatMessage.from_user(
        '''
        According to the following documents:
        {% for document in documents %}
        {{document.content}}
        {% endfor %}
        Answer the given question: {{query}}
        Answer:
        '''
    )
]
prompt_builder = ChatPromptBuilder(template=prompt_template, required_variables="*")

pipeline = Pipeline()
pipeline.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
pipeline.add_component("prompt_builder", prompt_builder)
pipeline.add_component("llm", OpenAIChatGenerator())
pipeline.connect("retriever.documents", "prompt_builder.documents")
pipeline.connect("prompt_builder.prompt", "llm.messages")

# Create a SuperComponent with simplified input/output mapping
wrapper = SuperComponent(
    pipeline=pipeline,
    input_mapping={
        "query": ["retriever.query", "prompt_builder.query"],
    },
    output_mapping={
        "llm.replies": "replies",
        "retriever.documents": "documents",
    },
)

# Run the pipeline with the simplified interface
result = wrapper.run(query="What is the capital of France?")
print(result)
```
```
{'replies': [ChatMessage(_role=<ChatRole.ASSISTANT: 'assistant'>,
_content=[TextContent(text='The capital of France is Paris.')], ...)]}
```
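The keys of the result dict come straight from `output_mapping`: `replies` from `llm.replies` and `documents` from `retriever.documents`. To pull the answer text out of a reply, you can use the message's `text` attribute. The snippet below illustrates only the access pattern, using a stand-in dataclass instead of Haystack's real `ChatMessage`:

```python
from dataclasses import dataclass


# Stand-in for Haystack's ChatMessage, used only to show the
# shape of the SuperComponent result (illustrative, not Haystack code).
@dataclass
class Reply:
    text: str


result = {"replies": [Reply(text="The capital of France is Paris.")]}

# The exposed "replies" key holds the chat messages; .text gives the answer.
answer = result["replies"][0].text
print(answer)
```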