pypi-anthropic

Description
The official Python library for the Anthropic API
Author
tessl
Last updated

How to use

npx @tessl/cli registry install tessl/pypi-anthropic@0.66.0

index.md docs/

# Anthropic Python SDK

The official Python library for the Anthropic API, providing convenient access to Claude AI models. This library offers both synchronous and asynchronous clients with comprehensive type definitions, streaming responses, message batching, tool use capabilities, and specialized integrations for AWS Bedrock and Google Vertex AI.

## Package Information

- **Package Name**: anthropic
- **Language**: Python
- **Installation**: `pip install anthropic`
- **Optional Dependencies**: `pip install anthropic[aiohttp]` for aiohttp support, `pip install anthropic[bedrock]` for AWS Bedrock, `pip install anthropic[vertex]` for Google Vertex AI

## Core Imports

```python
from anthropic import Anthropic, AsyncAnthropic
```

For specialized integrations:

```python
from anthropic import AnthropicBedrock, AsyncAnthropicBedrock
from anthropic import AnthropicVertex, AsyncAnthropicVertex
```

## Basic Usage

### Synchronous Client

```python
import os
from anthropic import Anthropic

client = Anthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello, Claude",
        }
    ],
    model="claude-sonnet-4-20250514",
)
print(message.content)
```

### Asynchronous Client

```python
import os
import asyncio
from anthropic import AsyncAnthropic

client = AsyncAnthropic(
    api_key=os.environ.get("ANTHROPIC_API_KEY"),
)

async def main():
    message = await client.messages.create(
        max_tokens=1024,
        messages=[
            {
                "role": "user",
                "content": "Hello, Claude",
            }
        ],
        model="claude-sonnet-4-20250514",
    )
    print(message.content)

asyncio.run(main())
```

## Architecture

The SDK follows a resource-based API design pattern, illustrated in the sketch after this list:

- **Client**: Main entry point (`Anthropic`, `AsyncAnthropic`) that manages authentication and HTTP configuration
- **Resources**: API endpoints organized by functionality (`messages`, `completions`, `models`, `beta`)
- **Types**: Comprehensive Pydantic models for all request parameters and response objects
- **Streaming**: Real-time response processing with event-based message streaming
- **Integrations**: Specialized clients for cloud platforms (Bedrock, Vertex AI)
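
A minimal sketch of how these pieces fit together, assuming `ANTHROPIC_API_KEY` is set in the environment; the model name and prompt are placeholders:

```python
from anthropic import Anthropic

client = Anthropic()  # client: reads ANTHROPIC_API_KEY from the environment

# Resources are attributes of the client
message = client.messages.create(      # messages resource
    model="claude-sonnet-4-20250514",  # placeholder model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Hi"}],
)
models = client.models.list()          # models resource
beta = client.beta                     # beta resource (experimental endpoints)

# Responses are typed Pydantic models
print(message.usage.input_tokens, message.usage.output_tokens)
```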

## Capabilities

### Messages API

Core conversational interface for interacting with Claude models, supporting multi-turn conversations, system prompts, tool use, streaming responses, and message batching.

```python { .api }
def create(
    max_tokens: int,
    messages: List[MessageParam],
    model: str,
    *,
    metadata: Optional[MetadataParam] = None,
    service_tier: Optional[Literal["auto", "standard_only"]] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    system: Optional[str] = None,
    temperature: Optional[float] = None,
    thinking: Optional[ThinkingConfigParam] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Message: ...

async def create(
    max_tokens: int,
    messages: List[MessageParam],
    model: str,
    *,
    metadata: Optional[MetadataParam] = None,
    service_tier: Optional[Literal["auto", "standard_only"]] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    system: Optional[str] = None,
    temperature: Optional[float] = None,
    thinking: Optional[ThinkingConfigParam] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Message: ...

def count_tokens(
    messages: List[MessageParam],
    model: str,
    *,
    system: Optional[str] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    **kwargs
) -> MessageTokensCount: ...
```
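A brief sketch combining a system prompt with `count_tokens` to estimate prompt size before sending; it assumes `ANTHROPIC_API_KEY` is set and reuses the model name from the examples above:

```python
from anthropic import Anthropic

client = Anthropic()
conversation = [{"role": "user", "content": "Summarize the plot of Hamlet in two sentences."}]

# Estimate how many input tokens the request will consume
count = client.messages.count_tokens(
    model="claude-sonnet-4-20250514",
    system="You are a concise literary assistant.",
    messages=conversation,
)
print("input tokens:", count.input_tokens)

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system="You are a concise literary assistant.",
    messages=conversation,
)
print(message.content[0].text)
```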
[Messages API](./messages.md)

### Text Completions API

Legacy text completion interface for generating raw text from prompts, retained for use cases that still require the older completion format.

```python { .api }
def create(
    max_tokens_to_sample: int,
    model: str,
    prompt: str,
    *,
    metadata: Optional[MetadataParam] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    temperature: Optional[float] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Completion: ...
```
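A hedged sketch of the legacy interface, assuming the `HUMAN_PROMPT`/`AI_PROMPT` constants exported by the package and an older, completion-capable model name (shown as a placeholder):

```python
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

client = Anthropic()

# The legacy format embeds the turn markers directly in the prompt string
completion = client.completions.create(
    model="claude-2.1",  # placeholder: an older model that accepts text completions
    max_tokens_to_sample=300,
    prompt=f"{HUMAN_PROMPT} How many toes do dogs have?{AI_PROMPT}",
)
print(completion.completion)
```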
[Text Completions](./completions.md)

### Streaming Interface

Real-time message streaming with event handlers for processing partial responses, tool use events, and completion updates as they arrive.

```python { .api }
class MessageStream:
    def __enter__(self) -> MessageStream: ...
    def __exit__(self, exc_type, exc_val, exc_tb) -> None: ...
    def __iter__(self) -> Iterator[MessageStreamEvent]: ...

    def on_text(self, handler: Callable[[TextEvent], None]) -> MessageStream: ...
    def on_input_json(self, handler: Callable[[InputJsonEvent], None]) -> MessageStream: ...
    def on_message_stop(self, handler: Callable[[MessageStopEvent], None]) -> MessageStream: ...
    def get_final_message(self) -> Message: ...

class AsyncMessageStream:
    async def __aenter__(self) -> AsyncMessageStream: ...
    async def __aexit__(self, exc_type, exc_val, exc_tb) -> None: ...
    def __aiter__(self) -> AsyncIterator[MessageStreamEvent]: ...

    def on_text(self, handler: Callable[[TextEvent], Awaitable[None]]) -> AsyncMessageStream: ...
    def on_input_json(self, handler: Callable[[InputJsonEvent], Awaitable[None]]) -> AsyncMessageStream: ...
    def on_message_stop(self, handler: Callable[[MessageStopEvent], Awaitable[None]]) -> AsyncMessageStream: ...
    async def get_final_message(self) -> Message: ...
```
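A minimal sketch using the higher-level `client.messages.stream(...)` helper, which yields a `MessageStream` context manager; the `text_stream` iterator of text deltas is assumed from the SDK's streaming helpers:

```python
from anthropic import Anthropic

client = Anthropic()

with client.messages.stream(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about the ocean."}],
) as stream:
    # Print text deltas as they arrive
    for text in stream.text_stream:
        print(text, end="", flush=True)

    # The fully accumulated Message is available once the stream ends
    final = stream.get_final_message()
    print("\nstop_reason:", final.stop_reason)
```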
[Streaming](./streaming.md)

### Models API

Access to available Claude models, including model information, capabilities, and metadata for selecting appropriate models for different use cases.

```python { .api }
def list(**kwargs) -> List[Model]: ...
async def list(**kwargs) -> List[Model]: ...
```
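A short sketch that lists the models visible to the API key; the `id` and `display_name` fields on each entry are assumptions based on the current Models API:

```python
from anthropic import Anthropic

client = Anthropic()

# Iterate the models available to this API key
for model in client.models.list():
    print(model.id, "-", model.display_name)
```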
[Models API](./models.md)

### Tool Use (Function Calling)

Integration system for connecting Claude with external functions and APIs, enabling the model to request tool calls and act on their results.

```python { .api }
class ToolParam(TypedDict):
    name: str
    description: str
    input_schema: Dict[str, Any]

class ToolChoiceParam(TypedDict, total=False):
    type: Literal["auto", "any", "none", "tool"]
    name: Optional[str]

class ToolUseBlock(TypedDict):
    type: Literal["tool_use"]
    id: str
    name: str
    input: Dict[str, Any]

class ToolResultBlockParam(TypedDict):
    type: Literal["tool_result"]
    tool_use_id: str
    content: Union[str, List[ContentBlockParam]]
    is_error: Optional[bool]
```
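A minimal sketch: one tool is declared with a JSON Schema and the response is scanned for `tool_use` blocks; the tool name and schema are illustrative placeholders:

```python
from anthropic import Anthropic

client = Anthropic()

# Declare a tool Claude may request; input_schema follows JSON Schema conventions
weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "input_schema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    tools=[weather_tool],
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
)

# If Claude decided to call the tool, the request shows up as a tool_use block
for block in message.content:
    if block.type == "tool_use":
        print(block.name, block.input)  # e.g. get_weather {'city': 'Paris'}
```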
[Tool Use](./tools.md)

### Message Batching

Efficient processing of multiple message requests in batches, providing cost optimization and throughput improvements for high-volume applications.

```python { .api }
def create(
    requests: List[Dict[str, Any]],
    **kwargs
) -> Any: ...

def retrieve(batch_id: str, **kwargs) -> Any: ...
def list(**kwargs) -> Any: ...
def cancel(batch_id: str, **kwargs) -> Any: ...
```
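A hedged sketch of submitting and polling a batch through `client.messages.batches`; the `custom_id`/`params` request shape follows the Message Batches API, but verify the field names against the batching docs:

```python
from anthropic import Anthropic

client = Anthropic()

# Each entry pairs a custom_id with ordinary Messages API parameters
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": "request-1",
            "params": {
                "model": "claude-sonnet-4-20250514",
                "max_tokens": 256,
                "messages": [{"role": "user", "content": "Hello, batch!"}],
            },
        },
    ]
)
print(batch.id, batch.processing_status)

# Later: poll until processing has ended, then fetch results
status = client.messages.batches.retrieve(batch.id)
print(status.processing_status)
```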
[Message Batching](./batching.md)

### Beta Features

Experimental and preview features, including early access to new model capabilities and API endpoints before they are finalized.

```python { .api }
class Beta:
    messages: BetaMessages
    models: BetaModels
    files: BetaFiles

class AsyncBeta:
    messages: AsyncBetaMessages
    models: AsyncBetaModels
    files: AsyncBetaFiles
```
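A brief sketch; it assumes the beta resources mirror their stable counterparts, so `client.beta.messages.create` accepts the same core parameters as `client.messages.create` (individual beta endpoints may require additional opt-in parameters not shown here):

```python
from anthropic import Anthropic

client = Anthropic()

# Beta endpoints live under client.beta and mirror the stable surface
message = client.beta.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello from the beta surface"}],
)
print(message.content)
```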
[Beta Features](./beta.md)

### AWS Bedrock Integration

Specialized client for accessing Claude models through Amazon Bedrock, with AWS authentication and Bedrock-specific configurations.

```python { .api }
class AnthropicBedrock:
    def __init__(
        self,
        *,
        aws_access_key: Optional[str] = None,
        aws_secret_key: Optional[str] = None,
        aws_session_token: Optional[str] = None,
        aws_region: Optional[str] = None,
        **kwargs
    ): ...

    messages: Messages
    completions: Completions

class AsyncAnthropicBedrock:
    def __init__(
        self,
        *,
        aws_access_key: Optional[str] = None,
        aws_secret_key: Optional[str] = None,
        aws_session_token: Optional[str] = None,
        aws_region: Optional[str] = None,
        **kwargs
    ): ...

    messages: AsyncMessages
    completions: AsyncCompletions
```
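A minimal sketch, assuming AWS credentials resolve through the standard credential chain and that the Bedrock model ID shown (a placeholder) is enabled in the chosen region:

```python
from anthropic import AnthropicBedrock

# Explicit keys are optional; the usual AWS credential chain is used otherwise
client = AnthropicBedrock(aws_region="us-east-1")

message = client.messages.create(
    model="anthropic.claude-sonnet-4-20250514-v1:0",  # placeholder Bedrock model ID
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello from Bedrock"}],
)
print(message.content)
```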
[AWS Bedrock](./bedrock.md)

### Google Vertex AI Integration

Specialized client for accessing Claude models through Google Cloud Vertex AI, with Google Cloud authentication and Vertex-specific configurations.

```python { .api }
class AnthropicVertex:
    def __init__(
        self,
        *,
        project_id: Optional[str] = None,
        region: Optional[str] = None,
        **kwargs
    ): ...

    messages: Messages
    completions: Completions

class AsyncAnthropicVertex:
    def __init__(
        self,
        *,
        project_id: Optional[str] = None,
        region: Optional[str] = None,
        **kwargs
    ): ...

    messages: AsyncMessages
    completions: AsyncCompletions
```
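A minimal sketch, assuming Application Default Credentials are configured; the project, region, and `@`-style Vertex model ID are placeholders for values enabled in your Google Cloud project:

```python
from anthropic import AnthropicVertex

client = AnthropicVertex(
    project_id="my-gcp-project",  # placeholder project
    region="us-east5",            # placeholder region
)

message = client.messages.create(
    model="claude-sonnet-4@20250514",  # placeholder Vertex-style model ID
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello from Vertex AI"}],
)
print(message.content)
```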
[Google Vertex AI](./vertex.md)

### Error Handling

Comprehensive exception hierarchy for handling API errors, network issues, authentication problems, and service-specific errors.

```python { .api }
class AnthropicError(Exception): ...
class APIError(AnthropicError): ...
class APIStatusError(APIError): ...
class APITimeoutError(APIError): ...
class APIConnectionError(APIError): ...
class APIResponseValidationError(APIError): ...

class BadRequestError(APIStatusError): ...
class AuthenticationError(APIStatusError): ...
class PermissionDeniedError(APIStatusError): ...
class NotFoundError(APIStatusError): ...
class ConflictError(APIStatusError): ...
class UnprocessableEntityError(APIStatusError): ...
class RateLimitError(APIStatusError): ...
class InternalServerError(APIStatusError): ...
```
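A minimal sketch of catching the most common failure modes; `status_code` and `response` on `APIStatusError` subclasses are part of the SDK's error surface, while the handling shown is only illustrative:

```python
import anthropic
from anthropic import Anthropic

client = Anthropic()

try:
    message = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=256,
        messages=[{"role": "user", "content": "Hello"}],
    )
except anthropic.RateLimitError as e:
    # 429: back off and retry later
    print("Rate limited:", e.status_code)
except anthropic.APIStatusError as e:
    # Any other non-success HTTP status returned by the API
    print("API error:", e.status_code, e.response)
except anthropic.APIConnectionError as e:
    # The request never reached the API (network problems, timeouts)
    print("Connection failed:", e.__cause__)
else:
    print(message.content)
```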
[Error Handling](./errors.md)

### Configuration and Utilities

Client configuration options, HTTP settings, timeout management, retry policies, and utility functions for file handling and request customization.

```python { .api }
class Anthropic:
    def __init__(
        self,
        *,
        api_key: Optional[str] = None,
        base_url: Optional[str] = None,
        timeout: Optional[Timeout] = None,
        max_retries: Optional[int] = None,
        default_headers: Optional[Mapping[str, str]] = None,
        default_query: Optional[Mapping[str, object]] = None,
        http_client: Optional[httpx.Client] = None,
        **kwargs
    ): ...

    @property
    def with_raw_response(self) -> AnthropicWithRawResponse: ...

    @property
    def with_streaming_response(self) -> AnthropicWithStreamedResponse: ...

def file_from_path(path: Union[str, Path]) -> Any: ...
```
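A minimal configuration sketch: per-client retry and timeout settings plus raw-response access via `with_raw_response`; the custom header name is purely illustrative:

```python
import httpx
from anthropic import Anthropic

client = Anthropic(
    max_retries=5,                                 # retry transient failures a few more times
    timeout=httpx.Timeout(60.0, connect=5.0),      # overall and connect timeouts
    default_headers={"X-My-Trace-Id": "example"},  # illustrative custom header
)

# Raw response access exposes HTTP details alongside the parsed object
response = client.messages.with_raw_response.create(
    model="claude-sonnet-4-20250514",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.headers.get("request-id"))
message = response.parse()  # the usual Message object
print(message.content)
```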
[Configuration](./configuration.md)