# Prompts

Flexible prompt templating system supporting chat templates, conditional prompts, and integration with various LLM formats. The prompt system provides powerful abstractions for creating reusable, parameterized prompts with support for different message roles and dynamic content generation.

## Capabilities

### Base Prompt Interface

Foundation interface for all prompt template implementations with standardized formatting and validation.

```python { .api }
class BasePromptTemplate:
    """
    Base interface for prompt template implementations.

    Parameters:
    - metadata: Optional[dict], metadata about the prompt template
    - template_vars: Optional[List[str]], list of template variable names
    - function_mappings: Optional[dict], mappings for function-based variables
    """
    def __init__(
        self,
        metadata: Optional[dict] = None,
        template_vars: Optional[List[str]] = None,
        function_mappings: Optional[dict] = None,
        **kwargs
    ): ...

    def format(self, **kwargs: Any) -> str:
        """
        Format the prompt template with provided variables.

        Parameters:
        - **kwargs: template variables and their values

        Returns:
        - str, formatted prompt string
        """

    def format_messages(self, **kwargs: Any) -> List[ChatMessage]:
        """
        Format the template as a list of chat messages.

        Parameters:
        - **kwargs: template variables and their values

        Returns:
        - List[ChatMessage], formatted messages for chat models
        """

    def get_template(self) -> str:
        """Get the raw template string."""

    def partial_format(self, **kwargs: Any) -> "BasePromptTemplate":
        """
        Create a partially formatted template with some variables filled.

        Parameters:
        - **kwargs: subset of template variables to fill

        Returns:
        - BasePromptTemplate, partially formatted template
        """

    @property
    def template_vars(self) -> List[str]:
        """Get list of template variable names."""
```

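The `template_vars` listing can be understood in terms of Python's standard `string.Formatter`, which parses `{variable}` placeholders out of a format string. A minimal stdlib-only sketch of that mechanic (`extract_template_vars` is a hypothetical helper, not part of the library):

```python
from string import Formatter

def extract_template_vars(template):
    """Collect named {placeholders} from a format string, in order, without duplicates."""
    seen = []
    for _, field_name, _, _ in Formatter().parse(template):
        if field_name and field_name not in seen:
            seen.append(field_name)
    return seen

print(extract_template_vars("Explain {topic} in {style} style for {audience}."))
# ['topic', 'style', 'audience']
```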
### Standard Prompt Templates

Core prompt template implementation for text-based prompts with variable substitution.

```python { .api }
class PromptTemplate(BasePromptTemplate):
    """
    Standard prompt template for text-based prompts with variable substitution.

    Parameters:
    - template: str, the prompt template string with {variable} placeholders
    - template_var_mappings: Optional[dict], mappings for template variable names
    - function_mappings: Optional[dict], function-based variable mappings
    - output_parser: Optional[BaseOutputParser], parser for template output
    """
    def __init__(
        self,
        template: str,
        template_var_mappings: Optional[dict] = None,
        function_mappings: Optional[dict] = None,
        output_parser: Optional[BaseOutputParser] = None,
        **kwargs
    ): ...

    @classmethod
    def from_langchain_prompt(cls, prompt: Any) -> "PromptTemplate":
        """Create PromptTemplate from a Langchain prompt."""

# Alias for backward compatibility
Prompt = PromptTemplate
```

### Chat Prompt Templates

Specialized templates for chat-based interactions with support for different message roles and conversation flows.

```python { .api }
class ChatPromptTemplate(BasePromptTemplate):
    """
    Chat-based prompt template supporting multiple message roles and conversation structure.

    Parameters:
    - message_templates: List[ChatMessage], list of message templates
    - system_template: Optional[ChatMessage], system message template
    - template_var_mappings: Optional[dict], variable name mappings
    - function_mappings: Optional[dict], function-based mappings
    - output_parser: Optional[BaseOutputParser], output parser for responses
    """
    def __init__(
        self,
        message_templates: List[ChatMessage],
        system_template: Optional[ChatMessage] = None,
        template_var_mappings: Optional[dict] = None,
        function_mappings: Optional[dict] = None,
        output_parser: Optional[BaseOutputParser] = None,
        **kwargs
    ): ...

    @classmethod
    def from_messages(cls, messages: List[ChatMessage]) -> "ChatPromptTemplate":
        """Create ChatPromptTemplate from a list of messages."""

    def format_messages(self, **kwargs: Any) -> List[ChatMessage]:
        """Format the template into chat messages."""
```

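Conceptually, `format_messages` applies the same variable substitution to every message template in order. A stdlib-only sketch of that flow, using plain `(role, content)` tuples as hypothetical stand-ins for `ChatMessage`:

```python
def format_message_templates(message_templates, **kwargs):
    """Substitute the same template variables into every (role, content) pair."""
    return [(role, content.format(**kwargs)) for role, content in message_templates]

formatted = format_message_templates(
    [
        ("system", "You are an expert in {domain}."),
        ("user", "Explain {topic}."),
    ],
    domain="machine learning",
    topic="backpropagation",
)
for role, content in formatted:
    print(f"{role}: {content}")
```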
### Conditional & Selector Templates

Advanced templates with conditional logic and dynamic template selection based on runtime conditions.

```python { .api }
class SelectorPromptTemplate(BasePromptTemplate):
    """
    Prompt template with conditional logic for dynamic template selection.

    Parameters:
    - default_template: BasePromptTemplate, default template when no conditions match
    - conditionals: List[Tuple[Callable, BasePromptTemplate]], condition-template pairs
    """
    def __init__(
        self,
        default_template: BasePromptTemplate,
        conditionals: Optional[List[Tuple[Callable, BasePromptTemplate]]] = None,
        **kwargs
    ): ...

    def select(self, **kwargs: Any) -> BasePromptTemplate:
        """
        Select the appropriate template based on conditions.

        Parameters:
        - **kwargs: variables for condition evaluation

        Returns:
        - BasePromptTemplate, selected template based on conditions
        """
```

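The selection behavior described above amounts to a first-match scan over the condition-template pairs, falling back to the default. A stdlib-only sketch (`select_template` is illustrative, not the library's implementation):

```python
def select_template(default_template, conditionals=None, **kwargs):
    """Return the first template whose condition evaluates truthy, else the default."""
    for condition, template in conditionals or []:
        if condition(**kwargs):
            return template
    return default_template

brief = "Give a brief answer about {concept}."
deep = "Give a detailed answer about {concept}."
conditionals = [(lambda detail_level="", **kw: detail_level == "detailed", deep)]

print(select_template(brief, conditionals, detail_level="basic"))     # brief template
print(select_template(brief, conditionals, detail_level="detailed"))  # detailed template
```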
### Integration Templates

Templates for integrating with external prompt systems and frameworks.

```python { .api }
class LangchainPromptTemplate(BasePromptTemplate):
    """
    Wrapper for Langchain prompt templates to provide LlamaIndex compatibility.

    Parameters:
    - prompt: Any, Langchain prompt template object
    - template_var_mappings: Optional[dict], variable name mappings
    - function_mappings: Optional[dict], function mappings
    - output_parser: Optional[BaseOutputParser], output parser
    """
    def __init__(
        self,
        prompt: Any,
        template_var_mappings: Optional[dict] = None,
        function_mappings: Optional[dict] = None,
        output_parser: Optional[BaseOutputParser] = None,
        **kwargs
    ): ...

class RichPromptTemplate(BasePromptTemplate):
    """
    Rich text prompt template with advanced formatting capabilities.

    Parameters:
    - template: str, rich text template with formatting markup
    - template_var_mappings: Optional[dict], variable mappings
    - function_mappings: Optional[dict], function mappings
    """
    def __init__(
        self,
        template: str,
        template_var_mappings: Optional[dict] = None,
        function_mappings: Optional[dict] = None,
        **kwargs
    ): ...
```

### Message Types & Roles

Structured message types for chat-based interactions with comprehensive role support.

```python { .api }
class ChatMessage:
    """
    Individual message in a chat conversation.

    Parameters:
    - role: MessageRole, role of the message sender
    - content: Union[str, List], message content (text or structured content)
    - additional_kwargs: Optional[dict], additional message metadata
    - tool_calls: Optional[List], tool calls in the message
    - tool_call_id: Optional[str], identifier for tool call responses
    """
    def __init__(
        self,
        role: MessageRole,
        content: Union[str, List] = "",
        additional_kwargs: Optional[dict] = None,
        tool_calls: Optional[List] = None,
        tool_call_id: Optional[str] = None,
        **kwargs
    ): ...

    @classmethod
    def from_str(
        cls,
        content: str,
        role: str = MessageRole.USER,
        **kwargs
    ) -> "ChatMessage":
        """Create ChatMessage from string content."""

class MessageRole(str, Enum):
    """Enumeration of message roles in chat conversations."""
    SYSTEM = "system"        # System instructions and context setting
    USER = "user"            # User input and questions
    ASSISTANT = "assistant"  # AI assistant responses
    FUNCTION = "function"    # Function call results (deprecated)
    TOOL = "tool"            # Tool execution results
```

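Because `MessageRole` mixes `str` into `Enum`, its members compare equal to plain strings and carry their wire-format value directly, which is what makes them convenient to serialize and to accept in place of raw strings (as in `from_str` above). A stdlib sketch using a hypothetical `Role` stand-in:

```python
from enum import Enum

class Role(str, Enum):  # hypothetical stand-in for MessageRole
    SYSTEM = "system"
    USER = "user"
    ASSISTANT = "assistant"

# str mix-in: members compare equal to plain strings and expose .value
print(Role.USER == "user")                   # True
print(Role("assistant") is Role.ASSISTANT)   # lookup by value returns the member
print([r.value for r in Role])               # ['system', 'user', 'assistant']
```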
### Prompt Types & Categories

Categorization system for different types of prompts and their intended usage patterns.

```python { .api }
class PromptType(str, Enum):
    """Enumeration of prompt types for categorization and selection."""
    QUESTION_ANSWER = "question_answer"
    REFINE = "refine"
    SUMMARY = "summary"
    SIMPLE_INPUT = "simple_input"
    CONDITIONAL_INPUT = "conditional_input"
    KEYWORD_EXTRACT = "keyword_extract"
    QUERY_KEYWORD_EXTRACT = "query_keyword_extract"
    SCHEMA_EXTRACT = "schema_extract"
    TEXT_TO_SQL = "text_to_sql"
    TABLE_CONTEXT = "table_context"
    KNOWLEDGE_TRIPLET_EXTRACT = "knowledge_triplet_extract"
    TREE_SUMMARIZE = "tree_summarize"
    TREE_INSERT = "tree_insert"
    TREE_SELECT = "tree_select"
    TREE_SELECT_MULTIPLE = "tree_select_multiple"
    SUB_QUESTION = "sub_question"
    PANDAS = "pandas"
    JSON_PATH = "json_path"
    CHOICE_SELECT = "choice_select"
    MULTI_SELECT = "multi_select"
    SINGLE_SELECT = "single_select"
```

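Because `PromptType` values are plain strings, they work naturally as registry keys and can be resolved from raw configuration strings. A stdlib sketch with a hypothetical, abbreviated `PromptKind` enum standing in for `PromptType`:

```python
from enum import Enum

class PromptKind(str, Enum):  # hypothetical, abbreviated stand-in for PromptType
    QUESTION_ANSWER = "question_answer"
    SUMMARY = "summary"

# A registry keyed by prompt type, e.g. for choosing a default template
DEFAULT_TEMPLATES = {
    PromptKind.QUESTION_ANSWER: "Answer the question: {query_str}",
    PromptKind.SUMMARY: "Summarize the text: {context_str}",
}

kind = PromptKind("summary")  # raw strings from config resolve to members
print(DEFAULT_TEMPLATES[kind])  # Summarize the text: {context_str}
```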
### Prompt Display & Utilities

Utilities for displaying and debugging prompt templates and their formatted output.

```python { .api }
def display_prompt_dict(prompts_dict: Dict[str, BasePromptTemplate]) -> None:
    """
    Display a dictionary of prompts in a formatted way for debugging.

    Parameters:
    - prompts_dict: Dict[str, BasePromptTemplate], dictionary of prompt templates
    """
```

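A helper like this essentially walks the dictionary and reports each key alongside its raw template. A stdlib-only sketch that builds the report as a string (`render_prompt_dict` is a hypothetical name, not the library function), shown here with plain template strings in place of template objects:

```python
def render_prompt_dict(prompts_dict):
    """Build a readable report of prompt keys and their raw template strings."""
    lines = []
    for key, template in prompts_dict.items():
        lines.append(f"Prompt key: {key}")
        lines.append(str(template))
        lines.append("")
    return "\n".join(lines)

print(render_prompt_dict({
    "qa": "Answer the question: {question}",
    "summary": "Summarize: {text}",
}))
```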
## Usage Examples

### Basic Prompt Template

```python
from llama_index.core.prompts import PromptTemplate

# Create a simple prompt template
template = PromptTemplate(
    template="Explain the concept of {topic} in {style} style for {audience}."
)

# Format the prompt
formatted_prompt = template.format(
    topic="machine learning",
    style="simple",
    audience="beginners"
)

print(formatted_prompt)
# Output: "Explain the concept of machine learning in simple style for beginners."

# Check template variables
print(f"Template variables: {template.template_vars}")
# Output: ['topic', 'style', 'audience']
```

### Chat Prompt Template

```python
from llama_index.core.prompts import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

# Create chat messages
messages = [
    ChatMessage(
        role=MessageRole.SYSTEM,
        content="You are a helpful AI assistant specializing in {domain}."
    ),
    ChatMessage(
        role=MessageRole.USER,
        content="I have a question about {topic}. Can you help me understand {specific_question}?"
    )
]

# Create chat prompt template
chat_template = ChatPromptTemplate.from_messages(messages)

# Format chat messages
formatted_messages = chat_template.format_messages(
    domain="machine learning",
    topic="neural networks",
    specific_question="how backpropagation works"
)

for msg in formatted_messages:
    print(f"{msg.role}: {msg.content}")
```

### Conditional Prompt Selection

```python
from llama_index.core.prompts import SelectorPromptTemplate, PromptTemplate

# Define templates for different scenarios
simple_template = PromptTemplate(
    template="Provide a brief explanation of {concept}."
)

detailed_template = PromptTemplate(
    template="Provide a comprehensive explanation of {concept}, including examples, applications, and technical details."
)

# Define condition function
def is_detailed_request(detail_level: str, **kwargs) -> bool:
    return detail_level.lower() in ["detailed", "comprehensive", "advanced"]

# Create selector template
selector = SelectorPromptTemplate(
    default_template=simple_template,
    conditionals=[
        (is_detailed_request, detailed_template)
    ]
)

# Format with different detail levels
simple_prompt = selector.format(concept="neural networks", detail_level="basic")
detailed_prompt = selector.format(concept="neural networks", detail_level="detailed")

print("Simple:", simple_prompt)
print("Detailed:", detailed_prompt)
```

### Multi-turn Conversation Template

```python
from llama_index.core.prompts import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

# Create a conversation template
conversation_template = ChatPromptTemplate.from_messages([
    ChatMessage(
        role=MessageRole.SYSTEM,
        content="You are an expert in {subject}. Answer questions clearly and provide examples when helpful."
    ),
    ChatMessage(
        role=MessageRole.USER,
        content="{user_question}"
    ),
    ChatMessage(
        role=MessageRole.ASSISTANT,
        content="I'll help you understand {subject}. {context_info}"
    ),
    ChatMessage(
        role=MessageRole.USER,
        content="Can you explain {follow_up_topic} in more detail?"
    )
])

# Format the conversation
conversation = conversation_template.format_messages(
    subject="machine learning",
    user_question="What is supervised learning?",
    context_info="Supervised learning uses labeled data to train models.",
    follow_up_topic="the difference between classification and regression"
)

print("Conversation flow:")
for i, msg in enumerate(conversation):
    print(f"{i+1}. {msg.role.upper()}: {msg.content}")
```

### Partial Template Formatting

```python
from llama_index.core.prompts import PromptTemplate

# Create template with multiple variables
analysis_template = PromptTemplate(
    template="Analyze {data_type} data from {source} using {method} approach. Focus on {aspect} and provide {output_format} results."
)

# Partially format with some variables
partial_template = analysis_template.partial_format(
    method="statistical",
    output_format="detailed"
)

# Complete formatting later
final_prompt = partial_template.format(
    data_type="customer behavior",
    source="web analytics",
    aspect="conversion patterns"
)

print(final_prompt)
```

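Under the hood, partial formatting can be modeled with `str.format_map` and a mapping that leaves unknown placeholders untouched, so the remaining variables can be filled in a second pass. A stdlib-only sketch of that two-stage flow (hypothetical helper names, not the library's implementation):

```python
class _KeepUnfilled(dict):
    """Mapping for str.format_map that leaves unknown {placeholders} intact."""
    def __missing__(self, key):
        return "{" + key + "}"

def partial_format(template, **kwargs):
    return template.format_map(_KeepUnfilled(**kwargs))

step1 = partial_format("Analyze {data_type} data using {method} methods.", method="statistical")
print(step1)  # Analyze {data_type} data using statistical methods.

# The remaining placeholder survives, so ordinary formatting completes it later
print(step1.format(data_type="customer"))  # Analyze customer data using statistical methods.
```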
### Function-based Variable Mapping

```python
from datetime import datetime

from llama_index.core.prompts import PromptTemplate

def get_current_date(**kwargs):
    # function mappings are invoked with the format kwargs, so accept **kwargs
    return datetime.now().strftime("%Y-%m-%d")

def get_greeting(time_of_day):
    greetings = {
        "morning": "Good morning",
        "afternoon": "Good afternoon",
        "evening": "Good evening"
    }
    return greetings.get(time_of_day, "Hello")

# Template with function mappings
dynamic_template = PromptTemplate(
    template="{greeting}! Today is {current_date}. Let's discuss {topic}.",
    function_mappings={
        "current_date": get_current_date,
        "greeting": lambda **kwargs: get_greeting("morning")
    }
)

# Format with dynamic functions
formatted = dynamic_template.format(topic="AI developments")
print(formatted)
```

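The mechanics here can be modeled as a two-step process: call each mapped function to produce a value, then run normal `str.format` substitution with the combined results. A stdlib-only sketch (`format_with_functions` is a hypothetical helper, assuming mapped functions receive the format kwargs):

```python
def format_with_functions(template, function_mappings=None, **kwargs):
    """Resolve function-mapped variables, then substitute everything with str.format."""
    resolved = dict(kwargs)
    for name, fn in (function_mappings or {}).items():
        resolved[name] = fn(**kwargs)  # each function sees the caller's kwargs
    return template.format(**resolved)

result = format_with_functions(
    "{greeting}! Let's discuss {topic}.",
    function_mappings={"greeting": lambda **kw: "Good morning"},
    topic="AI developments",
)
print(result)  # Good morning! Let's discuss AI developments.
```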
### Complex Chat Template with Tools

```python
from llama_index.core.prompts import ChatPromptTemplate
from llama_index.core.llms import ChatMessage, MessageRole

# Advanced chat template with tool integration
tool_chat_template = ChatPromptTemplate.from_messages([
    ChatMessage(
        role=MessageRole.SYSTEM,
        content="You are an AI assistant with access to tools. Use the {available_tools} when needed to answer questions about {domain}."
    ),
    ChatMessage(
        role=MessageRole.USER,
        content="{user_query}"
    ),
    ChatMessage(
        role=MessageRole.ASSISTANT,
        content="I'll help you with {user_query}. Let me use the appropriate tools to gather information."
    )
])

# Format with tool information
tool_conversation = tool_chat_template.format_messages(
    available_tools="search, calculator, and code_executor tools",
    domain="data science",
    user_query="How do I calculate the correlation between two datasets?"
)

for msg in tool_conversation:
    print(f"{msg.role}: {msg.content}")
```

### Template Validation and Debugging

```python
from llama_index.core.prompts import PromptTemplate

# Validate template variables
template = PromptTemplate(
    template="Generate a {output_type} about {subject} with {requirements}."
)

# Check what variables are needed
required_vars = template.template_vars
print(f"Required variables: {required_vars}")

# Attempt formatting with missing variables
try:
    template.format(subject="climate change")
except KeyError as e:
    print(f"Missing variable: {e}")

# Proper formatting
complete_prompt = template.format(
    output_type="report",
    subject="climate change",
    requirements="scientific citations and data visualizations"
)
print(complete_prompt)
```

### Display and Debug Prompts

```python
from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.prompts import (
    ChatPromptTemplate,
    PromptTemplate,
    display_prompt_dict,
)

# Collection of prompts for an application
prompts = {
    "question_answer": PromptTemplate(
        template="Answer the following question: {question}"
    ),
    "summarization": PromptTemplate(
        template="Summarize the following text in {length} sentences: {text}"
    ),
    "classification": ChatPromptTemplate.from_messages([
        ChatMessage(role=MessageRole.SYSTEM, content="You are a text classifier."),
        ChatMessage(role=MessageRole.USER, content="Classify this text: {text}")
    ])
}

# Display all prompts for debugging
display_prompt_dict(prompts)
```

## Template Library

Common prompt patterns and templates for typical LLM applications:

```python { .api }
# Question-Answering Template
QA_TEMPLATE = PromptTemplate(
    template="Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

# Refinement Template
REFINE_TEMPLATE = PromptTemplate(
    template="The original question is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, refine the original answer to better "
    "answer the question. If the context isn't useful, return "
    "the original answer.\n"
)

# Summarization Template
SUMMARIZE_TEMPLATE = PromptTemplate(
    template="Write a summary of the following. Try to use only the "
    "information provided. Try to include as many key details as possible.\n"
    "\n"
    "{context_str}\n"
    "\n"
    "SUMMARY:"
)
```

599
600
## Types & Configuration
601
602
```python { .api }
603
# Type alias for prompt template dictionaries
604
PromptMixinType = Dict[str, BasePromptTemplate]
605
606
# Template variable types
607
TemplateVarType = Union[str, Callable[[], str]]
608
609
# Default template delimiters
610
DEFAULT_TEMPLATE_VAR_FORMAT = "{{{var}}}"
611
CHAT_TEMPLATE_VAR_FORMAT = "{var}"
612
613
# Validation settings
614
VALIDATE_TEMPLATE_VARS = True
615
ALLOW_UNDEFINED_TEMPLATE_VARS = False
616
```