
# Anthropic Python SDK

The official Python library for the Anthropic API, providing access to Claude AI models for conversational AI, content generation, and text analysis.

## Package Information

- **Package Name**: anthropic
- **Package Type**: pypi
- **Language**: Python
- **Installation**: `pip install anthropic`
- **Version**: 0.74.0

## Core Imports

```python
from anthropic import Anthropic, AsyncAnthropic
```

For types:

```python
from anthropic.types import Message, ContentBlock, Usage
```

## Basic Usage

```python
from anthropic import Anthropic

# Initialize the client with your API key
client = Anthropic(api_key="your-api-key")

# Create a message
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ]
)

print(message.content[0].text)
```

Async usage:

```python
from anthropic import AsyncAnthropic
import asyncio

async def main():
    client = AsyncAnthropic(api_key="your-api-key")

    message = await client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[
            {"role": "user", "content": "Hello, Claude!"}
        ]
    )

    print(message.content[0].text)

asyncio.run(main())
```

## Architecture

The SDK is organized around these core components:

- **Client Classes**: Main entry points (`Anthropic`, `AsyncAnthropic`) with sync and async variants
- **Resource Classes**: API endpoint groups (messages, models, completions)
- **Type System**: Comprehensive Pydantic models for all requests and responses
- **Platform Integrations**: Support for AWS Bedrock, Google Vertex AI, and Azure AI Foundry
- **Streaming Support**: Real-time message generation via Server-Sent Events
- **Tool Calling**: Function decorators and automatic tool execution loops
- **Beta Features**: Extended capabilities including file uploads and skills management

## Capabilities

### Client Initialization

Create and configure the main client for interacting with the Anthropic API.

```python { .api }
class Anthropic:
    def __init__(
        self,
        api_key: str | None = None,
        auth_token: str | None = None,
        base_url: str | httpx.URL | None = None,
        timeout: float | Timeout | None | NotGiven = NOT_GIVEN,
        max_retries: int = 2,
        default_headers: Mapping[str, str] | None = None,
        default_query: Mapping[str, object] | None = None,
        http_client: httpx.Client | None = None,
    ): ...
```

```python { .api }
class AsyncAnthropic:
    def __init__(
        self,
        api_key: str | None = None,
        auth_token: str | None = None,
        base_url: str | httpx.URL | None = None,
        timeout: float | Timeout | None | NotGiven = NOT_GIVEN,
        max_retries: int = 2,
        default_headers: Mapping[str, str] | None = None,
        default_query: Mapping[str, object] | None = None,
        http_client: httpx.AsyncClient | None = None,
    ): ...
```

The `api_key` defaults to the `ANTHROPIC_API_KEY` environment variable if not provided. The `base_url` defaults to `https://api.anthropic.com`.
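A minimal configuration sketch based on the constructor above (assumes `ANTHROPIC_API_KEY` is set in your environment):

```python
import os

from anthropic import Anthropic

# Picks up ANTHROPIC_API_KEY from the environment when api_key is omitted
client = Anthropic()

# Or configure the client explicitly
configured_client = Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"],
    timeout=30.0,    # request timeout in seconds
    max_retries=3,   # default is 2
)
```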

[Client Initialization](./client-initialization.md)

### Message Creation

Create conversational messages with Claude models, including multi-turn conversations and system prompts.

```python { .api }
def create(
    self,
    *,
    max_tokens: int,
    messages: List[MessageParam],
    model: str,
    metadata: MetadataParam | None = None,
    service_tier: str | None = None,
    stop_sequences: List[str] | None = None,
    stream: bool = False,
    system: str | List[TextBlockParam] | None = None,
    temperature: float | None = None,
    thinking: ThinkingConfigParam | None = None,
    tool_choice: ToolChoiceParam | None = None,
    tools: List[ToolUnionParam] | None = None,
    top_k: int | None = None,
    top_p: float | None = None,
    betas: List[str] | None = None,
) -> Message: ...
```
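For example, a multi-turn exchange with a system prompt might look like this (a sketch using the parameters above; the model name is illustrative):

```python
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system="You are a concise technical assistant.",
    messages=[
        {"role": "user", "content": "What does HTTP 429 mean?"},
        {"role": "assistant", "content": "The client has sent too many requests in a given time window."},
        {"role": "user", "content": "How should my code react to it?"},
    ],
)

print(message.content[0].text)
print(message.usage.input_tokens, message.usage.output_tokens)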

[Messages API](./messages.md)

### Streaming Messages

Stream message generation in real-time with automatic text accumulation and event handling.

```python { .api }
def stream(
    self,
    *,
    max_tokens: int,
    messages: List[MessageParam],
    model: str,
    # ... same parameters as create
) -> MessageStreamManager: ...
```
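A sketch of the context-manager usage, assuming the stream manager exposes a `text_stream` iterator and `get_final_message()` as described in the streaming docs:

```python
with client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about the sea."}],
) as stream:
    # Print text deltas as they arrive
    for text in stream.text_stream:
        print(text, end="", flush=True)

    # The fully accumulated Message is available once streaming finishes
    final_message = stream.get_final_message()
```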

160

161

[Streaming](./streaming.md)

162

163

### Tool Calling

164

165

Convert Python functions to tools that Claude can call, with automatic execution loops.

166

167

```python { .api }

168

def beta_tool(func: Callable) -> BetaFunctionTool: ...

169

170

def beta_async_tool(func: Callable) -> BetaAsyncFunctionTool: ...

171

```
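A hedged sketch: it assumes `beta_tool` is importable from the top-level `anthropic` package and that the decorated tool can be handed to the beta messages tool runner (see [tools-runners.md](./tools-runners.md)); check those docs for the exact import path and runner signature.

```python
from anthropic import Anthropic, beta_tool  # import path assumed; see tools.md

client = Anthropic()

@beta_tool
def get_weather(city: str) -> str:
    """Return a short weather summary for the given city."""
    # A real implementation would call a weather service here.
    return f"It is sunny in {city}."

# The decorator derives the tool's JSON schema from the signature and docstring.
# The tool runner (assumed API, see tools-runners.md) executes calls Claude makes to it.
runner = client.beta.messages.tool_runner(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    tools=[get_weather],
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
)
```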

172

173

[Tool Calling](./tools.md)

174

175

### Message Batches

176

177

Process multiple message requests efficiently in batches with automatic result aggregation.

178

179

```python { .api }

180

def create(

181

self,

182

*,

183

requests: List[MessageBatchRequestParam],

184

) -> MessageBatch: ...

185

186

def retrieve(

187

self,

188

message_batch_id: str,

189

) -> MessageBatch: ...

190

191

def results(

192

self,

193

message_batch_id: str,

194

) -> Iterator[MessageBatchIndividualResponse]: ...

195

```
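A sketch of creating and reading a batch, assuming each request carries a `custom_id` plus `params` that mirror `messages.create` (see [batches.md](./batches.md) for the exact shape):

```python
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": "request-1",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 256,
                "messages": [{"role": "user", "content": "Summarize HTTP/2 in one line."}],
            },
        },
        {
            "custom_id": "request-2",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 256,
                "messages": [{"role": "user", "content": "Summarize HTTP/3 in one line."}],
            },
        },
    ]
)

# Poll until processing has ended, then stream the per-request results
batch = client.messages.batches.retrieve(batch.id)
if batch.processing_status == "ended":
    for entry in client.messages.batches.results(batch.id):
        print(entry.custom_id, entry.result.type)
```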

[Batch Processing](./batches.md)

### Platform-Specific Clients

Use Claude models through AWS Bedrock, Google Vertex AI, or Azure AI Foundry with platform-native authentication.

```python { .api }
class AnthropicBedrock:
    def __init__(
        self,
        *,
        aws_region: str | None = None,
        aws_access_key: str | None = None,
        aws_secret_key: str | None = None,
        aws_session_token: str | None = None,
        # ... standard client parameters
    ): ...
```

```python { .api }
class AnthropicVertex:
    def __init__(
        self,
        *,
        region: str | NotGiven = NOT_GIVEN,
        project_id: str | NotGiven = NOT_GIVEN,
        access_token: str | None = None,
        credentials: GoogleCredentials | None = None,
        # ... standard client parameters
    ): ...
```

```python { .api }
class AnthropicFoundry:
    def __init__(
        self,
        *,
        resource: str | None = None,
        api_key: str | None = None,
        azure_ad_token_provider: Callable | None = None,
        base_url: str | None = None,
        # ... standard client parameters
    ): ...
```
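For example, the Bedrock client exposes the same messages interface as the standard client (a sketch; the region and the Bedrock model identifier are illustrative):

```python
from anthropic import AnthropicBedrock

# Credentials can also come from the standard AWS environment/profile chain
bedrock_client = AnthropicBedrock(aws_region="us-east-1")

message = bedrock_client.messages.create(
    model="anthropic.claude-3-5-sonnet-20241022-v2:0",  # Bedrock model ID; illustrative
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello from Bedrock!"}],
)
print(message.content[0].text)
```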

[Platform Clients](./platform-clients.md)

### Beta Features

Access experimental features including file uploads, skills management, and extended thinking capabilities.

```python { .api }
# File uploads
def upload(
    self,
    *,
    file: FileContent,
    purpose: str,
) -> FileMetadata: ...

# Skills management
def create(
    self,
    *,
    container: SkillContainerParam,
    description: str,
    name: str,
) -> SkillCreateResponse: ...
```
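A hedged sketch of a file upload, assuming the `upload` method above lives on `client.beta.files`; the `purpose` value here is illustrative (see [beta.md](./beta.md) for the accepted values):

```python
# Upload a file for later use with beta features (resource path assumed; see beta.md)
with open("report.pdf", "rb") as f:
    file_metadata = client.beta.files.upload(
        file=f,
        purpose="user_upload",  # illustrative purpose value
    )

print(file_metadata.id)
```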

[Beta Features](./beta.md)

### Model Information

Retrieve information about available models and their capabilities.

```python { .api }
def retrieve(
    self,
    model_id: str,
    *,
    betas: List[str] | None = None,
) -> ModelInfo: ...

def list(
    self,
    *,
    after_id: str | None = None,
    before_id: str | None = None,
    limit: int | None = None,
    betas: List[str] | None = None,
) -> SyncPage[ModelInfo]: ...
```
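For example (field names follow the `ModelInfo` type described in [models.md](./models.md)):

```python
# List available models; SyncPage is iterable and paginates automatically
for model in client.models.list(limit=20):
    print(model.id, model.display_name)

# Look up a single model by ID
info = client.models.retrieve("claude-3-5-sonnet-20241022")
print(info.created_at)
```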

[Models](./models.md)

### Error Handling

Handle API errors with specific exception types for different error conditions.

```python { .api }
class AnthropicError(Exception): ...
class APIError(AnthropicError): ...
class APIStatusError(APIError): ...
class BadRequestError(APIStatusError): ...
class AuthenticationError(APIStatusError): ...
class PermissionDeniedError(APIStatusError): ...
class NotFoundError(APIStatusError): ...
class RateLimitError(APIStatusError): ...
class InternalServerError(APIStatusError): ...
class APIConnectionError(APIError): ...
class APITimeoutError(APIConnectionError): ...
```
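A typical handling pattern, from most specific to most general exception:

```python
import anthropic

try:
    message = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=256,
        messages=[{"role": "user", "content": "Hello"}],
    )
except anthropic.RateLimitError:
    # 429: back off and retry later (the client already retries transient errors)
    ...
except anthropic.APIStatusError as e:
    # Any other non-success HTTP status
    print(e.status_code, e.response)
except anthropic.APIConnectionError as e:
    # Network problems, including timeouts
    print("Connection failed:", e.__cause__)
```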

[Error Handling](./errors.md)

## Types

The SDK uses Pydantic models for type safety. Key types include:

- `Message`: Complete message response with content blocks
- `ContentBlock`: Text, tool use, or thinking blocks
- `MessageParam`: Input message format
- `ToolParam`: Tool definition
- `ModelInfo`: Model metadata
- `Usage`: Token usage statistics
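A short sketch of working with these types on a response:

```python
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=256,
    messages=[{"role": "user", "content": "Name three prime numbers."}],
)

# Message is a Pydantic model: fields are typed and validated
for block in message.content:      # list of ContentBlock
    if block.type == "text":
        print(block.text)

usage = message.usage              # Usage
print(usage.input_tokens, usage.output_tokens)
```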

[Core Types](./types.md)

## Constants

```python { .api }
HUMAN_PROMPT: str = "\n\nHuman:"
AI_PROMPT: str = "\n\nAssistant:"
```

Legacy prompt constants for the older completion API.
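A sketch of how these constants were used with the legacy Text Completions endpoint; new code should use `client.messages.create` instead:

```python
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

client = Anthropic()

# Legacy Text Completions call using the prompt constants
completion = client.completions.create(
    model="claude-2.1",
    max_tokens_to_sample=256,
    prompt=f"{HUMAN_PROMPT} Summarize the Messages API in one sentence.{AI_PROMPT}",
)
print(completion.completion)
```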
