# Prompt Building

Create and format prompts for language models with dynamic content injection, template rendering, and structured prompt construction. Haystack provides flexible prompt-building components for text generation and chat completion models.

## Capabilities

### Text Prompt Building

Build dynamic text prompts with template variables and conditional logic.
```python { .api }
class PromptBuilder:
    def __init__(
        self,
        template: str,
        required_variables: Optional[List[str]] = None,
        variables: Optional[List[str]] = None
    ) -> None:
        """
        Initialize the prompt builder with a Jinja2 template.

        Args:
            template: Jinja2 template string with placeholder variables
            required_variables: Variables that must be provided at run time
            variables: All variables the template may use
        """

    def run(self, **kwargs) -> Dict[str, str]:
        """
        Build the prompt by rendering the template with the provided variables.

        Args:
            **kwargs: Template variables to inject into the prompt

        Returns:
            Dictionary with a 'prompt' key containing the rendered prompt string
        """

    @property
    def template(self) -> str:
        """Get the current template string."""

    def set_template(self, template: str) -> None:
        """
        Update the template string.

        Args:
            template: New Jinja2 template string
        """
```
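`PromptBuilder` renders with standard Jinja2 semantics, where an undefined variable silently renders as an empty string. A minimal sketch using `jinja2` directly (a Haystack dependency) shows why declaring `required_variables` is worth doing:

```python
from jinja2 import Template

# The same Jinja2 default PromptBuilder relies on: an undefined
# variable renders as an empty string rather than raising an error.
template = Template("Context: {{context}}\nQuestion: {{question}}")

# Omitting 'context' silently leaves an empty slot in the prompt.
# Listing it in required_variables makes the builder fail fast instead.
print(template.render(question="What is Python?"))
# Context: 
# Question: What is Python?
```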

### Chat Prompt Building

Build structured chat prompts with message formatting and role management.
```python { .api }
class ChatPromptBuilder:
    def __init__(
        self,
        template: List[ChatMessage],
        required_variables: Optional[List[str]] = None,
        variables: Optional[List[str]] = None
    ) -> None:
        """
        Initialize the chat prompt builder with message templates.

        Args:
            template: List of ChatMessage templates with placeholder variables
            required_variables: Variables that must be provided at run time
            variables: All variables the templates may use
        """

    def run(self, **kwargs) -> Dict[str, List[ChatMessage]]:
        """
        Build the chat prompt by rendering the message templates.

        Args:
            **kwargs: Template variables to inject into the messages

        Returns:
            Dictionary with a 'prompt' key containing the list of rendered ChatMessages
        """

    @property
    def template(self) -> List[ChatMessage]:
        """Get the current message templates."""

    def set_template(self, template: List[ChatMessage]) -> None:
        """
        Update the message templates.

        Args:
            template: New list of ChatMessage templates
        """
```

### Answer Building

Format and structure answers with source citations and metadata.
```python { .api }
class AnswerBuilder:
    def __init__(
        self,
        pattern: Optional[str] = None,
        reference_pattern: Optional[str] = None
    ) -> None:
        """
        Initialize the answer builder.

        Args:
            pattern: Regular expression used to extract the answer text from a
                generated reply; if None, the whole reply is used as the answer
            reference_pattern: Regular expression used to extract document
                references (for example, bracketed indices) from a generated
                reply; if None, all provided documents are attached to the answer
        """

    def run(
        self,
        query: str,
        replies: List[str],
        documents: Optional[List[Document]] = None,
        pattern: Optional[str] = None,
        reference_pattern: Optional[str] = None
    ) -> Dict[str, List[GeneratedAnswer]]:
        """
        Build structured answers with citations and references.

        Args:
            query: Original question or query
            replies: Generated answer texts from the LLM
            documents: Source documents used for answer generation
            pattern: Per-call override for the answer-extraction pattern
            reference_pattern: Per-call override for the reference-extraction pattern

        Returns:
            Dictionary with an 'answers' key containing a list of GeneratedAnswer objects
        """
```
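The reference-extraction step boils down to pulling citation indices out of the reply and mapping them back to the retrieved documents. A stdlib sketch of that mechanic (the reply text and pattern here are illustrative):

```python
import re

# A reply containing bracketed citations, and a reference pattern that
# captures the cited (1-based) document indices.
reply = "Python was created in 1991 [1]. It is known for readable syntax [2]."
reference_pattern = r"\[(\d+)\]"

cited_indices = [int(m) for m in re.findall(reference_pattern, reply)]
print(cited_indices)  # [1, 2]

# Map the indices back to the retrieved documents' sources
sources = ["Python History Wiki", "Python Documentation"]
cited_sources = [sources[i - 1] for i in cited_indices]
print(cited_sources)  # ['Python History Wiki', 'Python Documentation']
```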

## Usage Examples

### Basic Text Prompt Building
```python
from haystack.components.builders import PromptBuilder

# Create a prompt template
template = """
You are a helpful assistant. Answer the following question based on the provided context.

Context: {{context}}

Question: {{question}}

Answer:
"""

# Initialize prompt builder
prompt_builder = PromptBuilder(template=template)

# Build prompt with variables
result = prompt_builder.run(
    context="Python is a high-level programming language known for its simplicity.",
    question="What is Python?"
)

print(result["prompt"])
# Output:
# You are a helpful assistant. Answer the following question based on the provided context.
#
# Context: Python is a high-level programming language known for its simplicity.
#
# Question: What is Python?
#
# Answer:
```

### Advanced Template with Conditional Logic
```python
# Template with conditional logic and loops
advanced_template = """
System: You are an expert {{domain}} assistant.

{% if context_documents %}
Available Context:
{% for doc in context_documents %}
- {{doc.content}}
{% endfor %}
{% endif %}

{% if examples %}
Examples:
{% for example in examples %}
Q: {{example.question}}
A: {{example.answer}}
{% endfor %}
{% endif %}

User Query: {{user_query}}

Please provide a {{response_type}} response.
"""

prompt_builder = PromptBuilder(template=advanced_template)

result = prompt_builder.run(
    domain="machine learning",
    context_documents=[
        {"content": "Neural networks are computational models inspired by biological neural networks."}
    ],
    examples=[
        {"question": "What is supervised learning?", "answer": "Learning with labeled training data."}
    ],
    user_query="Explain deep learning",
    response_type="detailed"
)

print(result["prompt"])
```

### Chat Prompt Building
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

# Create chat message templates using the role-specific factory methods
message_templates = [
    ChatMessage.from_system(
        "You are a helpful assistant specialized in {{domain}}. "
        "Always provide accurate and helpful information."
    ),
    ChatMessage.from_user(
        "Context: {{context}}\n\nBased on this context, {{instruction}}"
    ),
]

# Initialize chat prompt builder
chat_prompt_builder = ChatPromptBuilder(template=message_templates)

# Build chat prompt
result = chat_prompt_builder.run(
    domain="software engineering",
    context="Python is an interpreted, object-oriented programming language.",
    instruction="explain the key features of Python"
)

chat_messages = result["prompt"]
for message in chat_messages:
    print(f"{message.role.value}: {message.text}")
```

### Dynamic Chat Conversation Building
```python
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

# Conversation history as plain dicts (e.g. loaded from a session store)
message_history = [
    {"content": "I'm learning Python", "role": "user"},
    {"content": "That's great! Python is an excellent language to start with.", "role": "assistant"},
]

# Jinja placeholders only work inside message content, not as Python values,
# so the variable-length history is added to the template as real messages.
conversation_template = [
    ChatMessage.from_system(
        "You are {{assistant_persona}}. "
        "{% if conversation_style %}Use a {{conversation_style}} tone.{% endif %}"
    )
]

for msg in message_history:
    if msg["role"] == "user":
        conversation_template.append(ChatMessage.from_user(msg["content"]))
    else:
        conversation_template.append(ChatMessage.from_assistant(msg["content"]))

# Add the current user message as a templated turn
conversation_template.append(ChatMessage.from_user("{{current_message}}"))

chat_builder = ChatPromptBuilder(template=conversation_template)

# Render the system message and current turn against the template variables
result = chat_builder.run(
    assistant_persona="a friendly coding tutor",
    conversation_style="encouraging",
    current_message="Can you explain functions?",
)
```

### Answer Building with Citations
```python
from haystack.components.builders import AnswerBuilder
from haystack import Document

# reference_pattern is a regex that extracts bracketed citation indices
# from the reply, so only the cited documents are attached to the answer
answer_builder = AnswerBuilder(reference_pattern=r"\[(\d+)\]")

# Prepare documents and generated replies
documents = [
    Document(content="Python was created by Guido van Rossum in 1991.",
             meta={"source": "Python History Wiki"}),
    Document(content="Python is known for its readable syntax and extensive libraries.",
             meta={"source": "Python Documentation"})
]

replies = ["Python was created by Guido van Rossum in 1991 [1]. It's known for its readable syntax [2]."]

# Build structured answer
result = answer_builder.run(
    query="Who created Python and when?",
    replies=replies,
    documents=documents
)

answer = result["answers"][0]
print(f"Query: {answer.query}")
print(f"Answer: {answer.data}")
print(f"Documents used: {len(answer.documents)}")
```

### RAG Pipeline with Prompt Building
```python
from haystack import Pipeline
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryEmbeddingRetriever

# Assumes `document_store` (a document store with embedded documents) and
# `query_embedding` (the embedded query vector) are already set up.

# Create RAG pipeline with prompt building
rag_pipeline = Pipeline()

# RAG prompt template
rag_template = """
Answer the question based ONLY on the provided context. If the context doesn't contain
enough information to answer the question, say "I don't have enough information to answer this question."

Context:
{% for doc in documents %}
- {{doc.content}}
{% endfor %}

Question: {{question}}

Answer:
"""

# Add components
rag_pipeline.add_component("retriever", InMemoryEmbeddingRetriever(document_store=document_store, top_k=3))
rag_pipeline.add_component("prompt_builder", PromptBuilder(template=rag_template))
rag_pipeline.add_component("generator", OpenAIGenerator(model="gpt-3.5-turbo"))

# Connect components
rag_pipeline.connect("retriever.documents", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder.prompt", "generator.prompt")

# Run RAG pipeline
result = rag_pipeline.run({
    "retriever": {"query_embedding": query_embedding},
    "prompt_builder": {"question": "What is Python used for?"}
})

print(result["generator"]["replies"][0])
```
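Retrieval depth (`top_k`) directly controls how much text lands in the prompt. A stdlib sketch of a companion step some pipelines add before templating, capping total context size in rank order (the helper name and character budget are illustrative, not part of Haystack):

```python
# Illustrative helper: keep documents in rank order until a character
# budget is exhausted, so a large top_k can't overflow the context window.
def fit_to_budget(contents: list[str], max_chars: int = 2000) -> list[str]:
    selected, used = [], 0
    for text in contents:
        if used + len(text) > max_chars:
            break  # stop at the first document that would exceed the budget
        selected.append(text)
        used += len(text)
    return selected

ranked = ["a" * 900, "b" * 900, "c" * 900]  # stand-ins for doc.content
print([len(text) for text in fit_to_budget(ranked)])  # [900, 900]
```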

### Multi-Turn Chat Pipeline
```python
from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage

# Create multi-turn chat pipeline
chat_pipeline = Pipeline()

# Add components; the message template is supplied per run, since the
# conversation history changes on every turn
chat_pipeline.add_component("chat_prompt_builder", ChatPromptBuilder())
chat_pipeline.add_component("chat_generator", OpenAIChatGenerator(model="gpt-3.5-turbo"))

# Connect components
chat_pipeline.connect("chat_prompt_builder.prompt", "chat_generator.messages")

conversation_history = []

def build_template(history: list) -> list:
    """Rebuild the message template each turn from the real history."""
    template = [ChatMessage.from_system(
        "You are a helpful assistant. Keep track of the conversation context."
    )]
    for msg in history:
        if msg["role"] == "user":
            template.append(ChatMessage.from_user(msg["content"]))
        else:
            template.append(ChatMessage.from_assistant(msg["content"]))
    # The current turn stays a Jinja placeholder, filled in at run time
    template.append(ChatMessage.from_user("{{current_input}}"))
    return template

def chat_turn(user_input: str) -> str:
    result = chat_pipeline.run({
        "chat_prompt_builder": {
            "template": build_template(conversation_history),
            "template_variables": {"current_input": user_input},
        }
    })

    response = result["chat_generator"]["replies"][0]

    # Update conversation history
    conversation_history.append({"content": user_input, "role": "user"})
    conversation_history.append({"content": response.text, "role": "assistant"})

    return response.text

# Use the chat system
response1 = chat_turn("My name is Alice")
response2 = chat_turn("What's my name?")  # Should remember "Alice"
```
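Because every turn appends two messages to the history, the prompt grows without bound as the conversation continues. A stdlib sketch of a common companion step, trimming the history to the most recent turns before it reaches the prompt builder (the helper name and turn limit are illustrative):

```python
# Illustrative helper: bound the history passed to the prompt builder
# so the chat prompt stays within a fixed number of recent turns.
def trim_history(history: list[dict], max_turns: int = 4) -> list[dict]:
    # One turn = one user message plus one assistant message
    return history[-2 * max_turns:]

history = [
    {"content": f"msg {i}", "role": "user" if i % 2 == 0 else "assistant"}
    for i in range(12)
]
trimmed = trim_history(history, max_turns=4)
print(len(trimmed), trimmed[0]["content"])  # 8 msg 4
```

A fixed window loses older context; pipelines that need long-term memory usually pair trimming with a running summary of the dropped turns.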

### Custom Template Functions
```python
from typing import List

from haystack import Document
from haystack.components.builders import PromptBuilder

# PromptBuilder renders templates with its own internal Jinja2 environment,
# so filters registered on a separately created jinja2.Environment are never
# picked up. Preprocess values in Python instead, and use Jinja2's built-in
# filters (such as 'truncate') inside the template.

def format_documents(documents: List[Document], max_docs: int = 3) -> str:
    """Format the top documents as a numbered context block."""
    formatted = []
    for i, doc in enumerate(documents[:max_docs]):
        formatted.append(f"Document {i + 1}: {doc.content}")
    return "\n".join(formatted)

# 'truncate' is a built-in Jinja2 filter and needs no registration
template_with_functions = """
Query: {{query}}

Context (showing top {{max_docs}} documents):
{{formatted_context}}

Each document truncated to {{max_length}} characters:
{% for doc in documents %}
- {{doc.content|truncate(max_length)}}
{% endfor %}

Answer:
"""

prompt_builder = PromptBuilder(template=template_with_functions)

# `documents` is a list of Document objects, e.g. from the earlier examples
result = prompt_builder.run(
    query="What is machine learning?",
    documents=documents,
    formatted_context=format_documents(documents, max_docs=2),
    max_docs=2,
    max_length=50,
)
```

## Types

```python { .api }
from typing import List, Dict, Any, Optional
from haystack.dataclasses import ChatMessage, Document, GeneratedAnswer

class TemplateVariable:
    name: str
    type: str
    required: bool
    default: Optional[Any]

class PromptTemplate:
    content: str
    variables: List[TemplateVariable]

class ChatTemplate:
    messages: List[ChatMessage]
    variables: List[TemplateVariable]
```