# Tool Runner Classes

Tool runners implement automatic execution loops that handle the request-response cycle with Claude. When Claude requests a tool, runners automatically execute it, send the results back, and continue the conversation until completion. This eliminates the need for manual tool execution loops.
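For comparison, the loop that a runner automates looks roughly like the sketch below. This is illustrative only: it uses the standard raw `tools` / `tool_result` message shapes with `client.beta.messages.create`, and the `run_tool` dispatcher is a hypothetical stand-in for your own tool functions.

```python
from anthropic import Anthropic

client = Anthropic()

# A raw tool definition in the Messages API format.
tools = [
    {
        "name": "get_temperature",
        "description": "Get current temperature for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

def run_tool(name: str, tool_input: dict) -> str:
    # Hypothetical dispatcher: route the requested tool name to local code.
    if name == "get_temperature":
        return {"Tokyo": "18°C"}.get(tool_input["city"], "Unknown")
    raise ValueError(f"Unknown tool: {name}")

messages = [{"role": "user", "content": "What's the temperature in Tokyo?"}]

while True:
    response = client.beta.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        tools=tools,
        messages=messages,
    )
    if response.stop_reason != "tool_use":
        break  # Claude is done requesting tools

    # Echo the assistant turn, then answer each tool_use block with a tool_result.
    messages.append({"role": "assistant", "content": response.content})
    messages.append({
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                "tool_use_id": block.id,
                "content": run_tool(block.name, block.input),
            }
            for block in response.content
            if block.type == "tool_use"
        ],
    })
```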
## Capabilities

### Synchronous Tool Runner

Automatic tool execution loop for synchronous tools; the runner iterates through conversation turns until no more tools are requested.
```python { .api }
class BetaToolRunner:
    """
    Synchronous tool runner with automatic execution loop.

    Manages the conversation flow, automatically:
    - Sending messages to Claude
    - Detecting tool use requests
    - Executing tools with provided inputs
    - Sending tool results back to Claude
    - Continuing until Claude stops requesting tools

    The runner is an iterator that yields ParsedBetaMessage objects for each
    turn of the conversation.
    """

    def until_done(self) -> ParsedBetaMessage[ResponseFormatT]:
        """
        Consume the entire tool execution loop and return the final message.

        This is the most common usage pattern. It runs the loop until Claude
        stops requesting tools and returns the final response.

        Returns:
            The final message from Claude after all tools have been executed
        """

    def generate_tool_call_response(self) -> BetaMessageParam | None:
        """
        Generate a message containing tool results from the last assistant message.

        Examines tool use blocks in the last message, executes the requested
        tools, and formats results as a user message. Results are cached, so
        repeated calls return the same response.

        Returns:
            Message parameter with tool results, or None if no tools were called

        Note:
            Tool errors are caught and returned as error results rather than
            raising exceptions.
        """

    def set_messages_params(
        self,
        params: ParseMessageCreateParamsBase[ResponseFormatT]
        | Callable[[ParseMessageCreateParamsBase[ResponseFormatT]], ParseMessageCreateParamsBase[ResponseFormatT]]
    ) -> None:
        """
        Update parameters for the next API call.

        This invalidates any cached tool responses.

        Args:
            params: Either new parameters or a function to transform existing parameters
        """

    def append_messages(self, *messages: BetaMessageParam | ParsedBetaMessage[ResponseFormatT]) -> None:
        """
        Add one or more messages to the conversation history.

        This invalidates cached tool responses, causing tools to be executed
        again on the next iteration.

        Args:
            messages: One or more messages to append to the conversation
        """

    def __iter__(self) -> Iterator[ParsedBetaMessage[ResponseFormatT]]:
        """
        Iterate over messages as the tool loop executes.

        Yields:
            A ParsedBetaMessage for each turn of the conversation
        """
```
#### Usage Examples

**Basic tool runner usage:**

```python
from anthropic import Anthropic, beta_tool

@beta_tool
def get_temperature(city: str) -> str:
    """Get current temperature for a city.

    Args:
        city: The city name
    """
    # Simulated weather API
    temps = {"Tokyo": "18°C", "London": "12°C", "New York": "15°C"}
    return temps.get(city, "Unknown")

client = Anthropic()

# Create and run tool loop
runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[get_temperature],
    messages=[{"role": "user", "content": "What's the temperature in Tokyo?"}]
)

# Get final message after all tool execution completes
final_message = runner.until_done()
print(final_message.content[0].text)
# "The current temperature in Tokyo is 18°C."
```
**Multiple tools with automatic selection:**

```python
@beta_tool
def add(x: float, y: float) -> str:
    """Add two numbers.

    Args:
        x: First number
        y: Second number
    """
    return str(x + y)

@beta_tool
def multiply(x: float, y: float) -> str:
    """Multiply two numbers.

    Args:
        x: First number
        y: Second number
    """
    return str(x * y)

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[add, multiply],
    messages=[{"role": "user", "content": "What is (5 + 3) * 4?"}]
)

final_message = runner.until_done()
# Claude will automatically call add(5, 3), then multiply(8, 4)
```
**Iterating through tool execution steps:**

```python
@beta_tool
def search(query: str) -> str:
    """Search for information.

    Args:
        query: Search query
    """
    return f"Results for: {query}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[search],
    messages=[{"role": "user", "content": "Find info about Python"}]
)

# Iterate to see each step
for message in runner:
    print(f"Stop reason: {message.stop_reason}")
    for block in message.content:
        if block.type == "text":
            print(f"Text: {block.text}")
        elif block.type == "tool_use":
            print(f"Tool call: {block.name}({block.input})")
```
**Manual tool response generation:**

```python
@beta_tool
def get_data(id: int) -> str:
    """Get data by ID.

    Args:
        id: Record ID
    """
    return f"Data for ID {id}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[get_data],
    messages=[{"role": "user", "content": "Get record 123"}]
)

# Get first message (before tool execution)
first_message = next(runner)
print(f"Tool requested: {first_message.stop_reason == 'tool_use'}")

# Generate tool response manually
tool_response = runner.generate_tool_call_response()
print(tool_response)
# {"role": "user", "content": [{"type": "tool_result", "tool_use_id": "...", "content": "Data for ID 123"}]}

# Continue iteration
for message in runner:
    print(message.content[0].text)
```
**Max iterations limit:**

```python
@beta_tool
def recursive_tool(n: int) -> str:
    """A tool that might cause infinite loops.

    Args:
        n: Input number
    """
    return f"Called with {n}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    max_iterations=5,  # Stop after 5 iterations
    tools=[recursive_tool],
    messages=[{"role": "user", "content": "Keep calling the tool"}]
)

final_message = runner.until_done()
# Stops after 5 iterations even if Claude wants to continue
```
**Dynamic message manipulation:**

```python
@beta_tool
def check_status() -> str:
    """Check system status."""
    return "System OK"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[check_status],
    messages=[{"role": "user", "content": "Check status"}]
)

# Get first response
first_message = next(runner)

# Add additional context before continuing
runner.append_messages({
    "role": "user",
    "content": "Also check the database"
})

# Continue execution with updated context
final_message = runner.until_done()
```
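**Updating request parameters (illustrative sketch):**

`set_messages_params` is not covered by the examples above. The sketch below reuses the `check_status` tool and shows the callable form, which receives the current request parameters and returns the parameters to use for subsequent turns; the dict-merge shown here assumes the params object behaves like a `TypedDict`.

```python
runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[check_status],
    messages=[{"role": "user", "content": "Check status"}]
)

# Get the first response
first_message = next(runner)

# Tighten the token budget for the remaining turns. Passing a callable
# transforms the existing parameters and invalidates cached tool responses.
runner.set_messages_params(lambda params: {**params, "max_tokens": 256})  # assumes params is dict-like

final_message = runner.until_done()
```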
### Streaming Tool Runner

Streaming synchronous tool runner that yields MessageStream objects for each turn of the conversation.
```python { .api }
class BetaStreamingToolRunner:
    """
    Streaming synchronous tool runner that yields MessageStream objects.

    Each iteration yields a stream for one turn of the conversation,
    allowing you to process streaming responses while maintaining
    automatic tool execution.
    """

    def until_done(self) -> ParsedBetaMessage[ResponseFormatT]:
        """
        Consume the entire tool execution loop and return the final message.

        Returns:
            The final message from Claude after all tools have been executed
        """

    def generate_tool_call_response(self) -> BetaMessageParam | None:
        """
        Generate a message containing tool results from the last assistant message.

        Returns:
            Message parameter with tool results, or None if no tools were called
        """

    def set_messages_params(
        self,
        params: ParseMessageCreateParamsBase[ResponseFormatT]
        | Callable[[ParseMessageCreateParamsBase[ResponseFormatT]], ParseMessageCreateParamsBase[ResponseFormatT]]
    ) -> None:
        """
        Update parameters for the next API call.

        Args:
            params: Either new parameters or a function to transform existing parameters
        """

    def append_messages(self, *messages: BetaMessageParam | ParsedBetaMessage[ResponseFormatT]) -> None:
        """
        Add one or more messages to the conversation history.

        Args:
            messages: One or more messages to append to the conversation
        """

    def __iter__(self) -> Iterator[BetaMessageStream[ResponseFormatT]]:
        """
        Iterate over message streams as the tool loop executes.

        Yields:
            A BetaMessageStream for each turn of the conversation
        """
```
#### Usage Examples

**Basic streaming tool runner:**

```python
from anthropic import Anthropic, beta_tool

@beta_tool
def search(query: str) -> str:
    """Search for information.

    Args:
        query: The search query
    """
    return f"Found results for: {query}"

client = Anthropic()

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[search],
    messages=[{"role": "user", "content": "Search for Python tutorials"}],
    stream=True
)

# Process each streaming response
for stream in runner:
    for event in stream:
        if event.type == "content_block_delta":
            if event.delta.type == "text_delta":
                print(event.delta.text, end="", flush=True)

    print("\n---Turn complete---")
```
**Streaming with progress tracking:**

```python
@beta_tool
def analyze_data(dataset: str) -> str:
    """Analyze a dataset.

    Args:
        dataset: Dataset name
    """
    # Simulated analysis
    return f"Analysis complete for {dataset}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[analyze_data],
    messages=[{"role": "user", "content": "Analyze the sales data"}],
    stream=True
)

turn_number = 0
for stream in runner:
    turn_number += 1
    print(f"\n=== Turn {turn_number} ===")

    text_content = []
    for event in stream:
        if event.type == "content_block_start":
            if event.content_block.type == "tool_use":
                print(f"Calling tool: {event.content_block.name}")

        elif event.type == "content_block_delta":
            if event.delta.type == "text_delta":
                text_content.append(event.delta.text)
                print(event.delta.text, end="", flush=True)

    if text_content:
        print()  # Newline after text

    # Get final message
    final_message = stream.get_final_message()
```
**Streaming with early termination:**

```python
@beta_tool
def long_operation(task: str) -> str:
    """Perform a long operation.

    Args:
        task: Task description
    """
    return f"Completed: {task}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[long_operation],
    messages=[{"role": "user", "content": "Do something"}],
    stream=True
)

for stream in runner:
    word_count = 0
    for event in stream:
        if event.type == "content_block_delta":
            if event.delta.type == "text_delta":
                words = event.delta.text.split()
                word_count += len(words)
                print(event.delta.text, end="", flush=True)

                # Stop if response is too long
                if word_count > 100:
                    print("\n[Response truncated]")
                    break
    break  # Exit runner early
```
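**Streaming runner with `until_done()`:**

Even with `stream=True`, the loop can be handed off entirely: `until_done()` consumes each stream internally and returns the final parsed message. A minimal sketch reusing the `search` tool defined above:

```python
runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[search],
    messages=[{"role": "user", "content": "Search for Python tutorials"}],
    stream=True
)

# Consumes every intermediate stream and returns the final message.
final_message = runner.until_done()
print(final_message.content[0].text)
```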
### Asynchronous Tool Runner

Automatic tool execution loop for asynchronous tools.
```python { .api }
class BetaAsyncToolRunner:
    """
    Asynchronous tool runner with automatic execution loop.

    Async version of BetaToolRunner. All methods are async equivalents.
    """

    async def until_done(self) -> ParsedBetaMessage[ResponseFormatT]:
        """
        Consume the entire tool execution loop and return the final message.

        Returns:
            The final message from Claude after all tools have been executed
        """

    async def generate_tool_call_response(self) -> BetaMessageParam | None:
        """
        Generate a message containing tool results from the last assistant message.

        Returns:
            Message parameter with tool results, or None if no tools were called
        """

    def set_messages_params(
        self,
        params: ParseMessageCreateParamsBase[ResponseFormatT]
        | Callable[[ParseMessageCreateParamsBase[ResponseFormatT]], ParseMessageCreateParamsBase[ResponseFormatT]]
    ) -> None:
        """
        Update parameters for the next API call.

        Args:
            params: Either new parameters or a function to transform existing parameters
        """

    def append_messages(self, *messages: BetaMessageParam | ParsedBetaMessage[ResponseFormatT]) -> None:
        """
        Add one or more messages to the conversation history.

        Args:
            messages: One or more messages to append to the conversation
        """

    async def __aiter__(self) -> AsyncIterator[ParsedBetaMessage[ResponseFormatT]]:
        """
        Async iterate over messages as the tool loop executes.

        Yields:
            A ParsedBetaMessage for each turn of the conversation
        """
```
#### Usage Examples

**Basic async tool runner:**

```python
from anthropic import AsyncAnthropic, beta_async_tool
import httpx

@beta_async_tool
async def fetch_url(url: str) -> str:
    """Fetch content from a URL.

    Args:
        url: The URL to fetch
    """
    async with httpx.AsyncClient() as client:
        response = await client.get(url)
        return response.text[:200]

client = AsyncAnthropic()

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[fetch_url],
    messages=[{"role": "user", "content": "Fetch https://example.com"}]
)

# Await final message
final_message = await runner.until_done()
print(final_message.content[0].text)
```
**Async iteration:**

```python
import asyncio

@beta_async_tool
async def async_search(query: str) -> str:
    """Search asynchronously.

    Args:
        query: Search query
    """
    await asyncio.sleep(0.1)  # Simulate API call
    return f"Results for: {query}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[async_search],
    messages=[{"role": "user", "content": "Search for async programming"}]
)

# Async iteration
async for message in runner:
    print(f"Turn complete: {message.stop_reason}")
    if message.stop_reason != "tool_use":
        print(message.content[0].text)
```
**Multiple async tools:**

```python
@beta_async_tool
async def get_weather(city: str) -> str:
    """Get weather for a city.

    Args:
        city: City name
    """
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.weather.com/{city}")
        return response.text

@beta_async_tool
async def get_news(topic: str) -> str:
    """Get news about a topic.

    Args:
        topic: News topic
    """
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.news.com/{topic}")
        return response.text

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[get_weather, get_news],
    messages=[{
        "role": "user",
        "content": "What's the weather in Tokyo and any news about Japan?"
    }]
)

final_message = await runner.until_done()
# Claude will automatically call both tools and synthesize results
```
### Asynchronous Streaming Tool Runner

Streaming asynchronous tool runner that yields AsyncMessageStream objects.
```python { .api }
class BetaAsyncStreamingToolRunner:
    """
    Streaming asynchronous tool runner that yields AsyncMessageStream objects.

    Async version of BetaStreamingToolRunner. Each iteration yields an async
    stream for one turn of the conversation.
    """

    async def until_done(self) -> ParsedBetaMessage[ResponseFormatT]:
        """
        Consume the entire tool execution loop and return the final message.

        Returns:
            The final message from Claude after all tools have been executed
        """

    async def generate_tool_call_response(self) -> BetaMessageParam | None:
        """
        Generate a message containing tool results from the last assistant message.

        Returns:
            Message parameter with tool results, or None if no tools were called
        """

    def set_messages_params(
        self,
        params: ParseMessageCreateParamsBase[ResponseFormatT]
        | Callable[[ParseMessageCreateParamsBase[ResponseFormatT]], ParseMessageCreateParamsBase[ResponseFormatT]]
    ) -> None:
        """
        Update parameters for the next API call.

        Args:
            params: Either new parameters or a function to transform existing parameters
        """

    def append_messages(self, *messages: BetaMessageParam | ParsedBetaMessage[ResponseFormatT]) -> None:
        """
        Add one or more messages to the conversation history.

        Args:
            messages: One or more messages to append to the conversation
        """

    async def __aiter__(self) -> AsyncIterator[BetaAsyncMessageStream[ResponseFormatT]]:
        """
        Async iterate over message streams as the tool loop executes.

        Yields:
            A BetaAsyncMessageStream for each turn of the conversation
        """
```
#### Usage Examples

**Basic async streaming:**
```python
import asyncio

from anthropic import AsyncAnthropic, beta_async_tool

@beta_async_tool
async def compute(expression: str) -> str:
    """Compute a mathematical expression.

    Args:
        expression: Math expression to evaluate
    """
    await asyncio.sleep(0.1)
    return str(eval(expression))  # Don't use eval in production!

client = AsyncAnthropic()

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[compute],
    messages=[{"role": "user", "content": "What is 25 * 17?"}],
    stream=True
)

async for stream in runner:
    async for event in stream:
        if event.type == "content_block_delta":
            if event.delta.type == "text_delta":
                print(event.delta.text, end="", flush=True)
    print()  # Newline between turns
```
**Async streaming with real-time updates:**

```python
import asyncio
from datetime import datetime

@beta_async_tool
async def process_task(task: str) -> str:
    """Process a task.

    Args:
        task: Task description
    """
    await asyncio.sleep(1)
    return f"Processed: {task}"

runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[process_task],
    messages=[{"role": "user", "content": "Process the data"}],
    stream=True
)

async for stream in runner:
    print(f"Stream started at {datetime.now()}")

    async for event in stream:
        if event.type == "content_block_start":
            if event.content_block.type == "tool_use":
                print(f"Tool starting: {event.content_block.name}")

        elif event.type == "content_block_delta":
            if event.delta.type == "text_delta":
                print(event.delta.text, end="", flush=True)

    print(f"\nStream ended at {datetime.now()}")
```
## Types

### Request Options
```python { .api }
class RequestOptions(TypedDict, total=False):
    """
    Options for API requests made by tool runners.
    """
    extra_headers: Headers | None
    extra_query: Query | None
    extra_body: Body | None
    timeout: float | httpx.Timeout | None | NotGiven
```

Additional options that can be passed to API requests.
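The snippet below only illustrates the shape of a `RequestOptions` value; how these options are supplied for a given runner depends on the SDK surface you are calling, so treat it as a structural sketch rather than a prescribed calling convention.

```python
import httpx

# A structural sketch of the RequestOptions fields documented above.
options = {
    "extra_headers": {"X-Request-Source": "docs-example"},  # merged into request headers
    "extra_query": {"debug": "true"},                       # appended to the query string
    "extra_body": None,                                     # extra JSON body properties
    "timeout": httpx.Timeout(60.0, connect=5.0),            # per-request timeout override
}
```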