The official Python library for the Anthropic API
Step-by-step guide to using the Anthropic Python SDK.
Install the SDK:

```bash
pip install anthropic
```

Optional extras (quote the brackets in shells such as zsh):

```bash
pip install "anthropic[bedrock]"  # AWS Bedrock
pip install "anthropic[vertex]"   # Google Vertex AI
pip install "anthropic[aiohttp]"  # Alternative async HTTP backend
```

Set your API key as an environment variable:

```bash
export ANTHROPIC_API_KEY='your-api-key'
```

Or pass it explicitly:

```python
from anthropic import Anthropic

client = Anthropic(api_key="your-api-key")
```

Send a first message:

```python
from anthropic import Anthropic

client = Anthropic()

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ]
)
print(message.content[0].text)
```

Configure Claude's behavior with system prompts:
```python
message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    system="You are a helpful AI assistant specializing in Python programming.",
    messages=[
        {"role": "user", "content": "How do I read a file?"}
    ]
)
```

Maintain conversation history:
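A conversation is just a list of `{"role": ..., "content": ...}` dicts that you grow turn by turn. As a minimal sketch (the helper name is my own, not part of the SDK), a small function can enforce the user/assistant alternation the API expects:

```python
def add_turn(history: list[dict], role: str, content: str) -> list[dict]:
    """Append a turn, enforcing the user/assistant alternation the API expects."""
    if role not in ("user", "assistant"):
        raise ValueError(f"unknown role: {role}")
    if history and history[-1]["role"] == role:
        raise ValueError(f"two consecutive {role!r} turns")
    history.append({"role": role, "content": content})
    return history

history: list[dict] = []
add_turn(history, "user", "My name is Alice.")
add_turn(history, "assistant", "Hello Alice! Nice to meet you.")
add_turn(history, "user", "What's my name?")
```

The resulting list is passed directly as the `messages` argument: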
```python
messages = [
    {"role": "user", "content": "My name is Alice."},
    {"role": "assistant", "content": "Hello Alice! Nice to meet you."},
    {"role": "user", "content": "What's my name?"}
]

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=messages
)
print(message.content[0].text)  # "Your name is Alice."
```

Stream responses for real-time feedback:
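Conceptually, the stream yields text a chunk at a time, and you typically both display the chunks and accumulate the full reply. The consumption pattern can be sketched against a stand-in generator (no live API call; the chunk contents are invented for illustration):

```python
from typing import Iterator

def fake_text_stream() -> Iterator[str]:
    """Stand-in for a text stream: yields text chunks as they 'arrive'."""
    yield from ["Once ", "upon ", "a ", "time."]

# Print each chunk as it arrives while also accumulating the full reply.
pieces = []
for text in fake_text_stream():
    print(text, end="", flush=True)
    pieces.append(text)
print()
full_reply = "".join(pieces)  # "Once upon a time."
```

With the SDK, the same loop runs over `stream.text_stream`: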
```python
with client.messages.stream(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Write a short story"}
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
print()
```

Send images to Claude:
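The `media_type` field has to match the actual image format. A small helper (my own, not part of the SDK) can derive it from the filename with the standard `mimetypes` module before building the content block; the supported formats listed here are the base64 image types Anthropic documents:

```python
import base64
import mimetypes

# Image formats accepted for base64 sources, per Anthropic's documentation.
SUPPORTED = {"image/jpeg", "image/png", "image/gif", "image/webp"}

def image_block(path: str, data: bytes) -> dict:
    """Build an image content block, inferring media_type from the filename."""
    media_type, _ = mimetypes.guess_type(path)
    if media_type not in SUPPORTED:
        raise ValueError(f"unsupported image type for {path}: {media_type}")
    return {
        "type": "image",
        "source": {
            "type": "base64",
            "media_type": media_type,
            "data": base64.standard_b64encode(data).decode(),
        },
    }

block = image_block("photo.png", b"\x89PNG...")  # bytes are a placeholder
```

The full request then looks like this: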
```python
import base64

with open("image.jpg", "rb") as f:
    image_data = base64.standard_b64encode(f.read()).decode()

message = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": [
            {
                "type": "image",
                "source": {
                    "type": "base64",
                    "media_type": "image/jpeg",
                    "data": image_data
                }
            },
            {"type": "text", "text": "What's in this image?"}
        ]
    }]
)
```

Always handle potential errors:
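Transient failures such as rate limits are typically retried with exponential backoff. A generic, SDK-free sketch of the pattern (`RetryableError` and the `flaky` call counter are illustrative stand-ins for an API call that fails twice before succeeding):

```python
import time

class RetryableError(Exception):
    """Stand-in for a transient failure such as a rate-limit response."""

def with_backoff(fn, max_attempts=4, base_delay=0.01):
    """Call fn, retrying on RetryableError with exponentially growing delays."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except RetryableError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RetryableError()
    return "ok"

result = with_backoff(flaky)  # succeeds on the third attempt
```

At minimum, catch the SDK's error types: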
```python
from anthropic import APIError, RateLimitError

try:
    message = client.messages.create(
        model="claude-sonnet-4-5-20250929",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello"}]
    )
except RateLimitError as e:
    print(f"Rate limited. Retry after {e.response.headers.get('retry-after')}s")
except APIError as e:
    print(f"API error: {e.message}")
```

For async applications:
```python
import asyncio
from anthropic import AsyncAnthropic

async def main():
    client = AsyncAnthropic()
    message = await client.messages.create(
        model="claude-sonnet-4-5-20250929",
        max_tokens=1024,
        messages=[{"role": "user", "content": "Hello"}]
    )
    print(message.content[0].text)

asyncio.run(main())
```

Use the client as a context manager so it is closed automatically:

```python
with Anthropic() as client:
    message = client.messages.create(...)
# Client automatically closed
```

Wrap calls in error handling:

```python
try:
    message = client.messages.create(...)
except APIError as e:
    # Handle error
    ...
```

Choose a model for your workload:

- `claude-sonnet-4-5-20250929` - Balanced intelligence and speed
- `claude-opus-4-5-20250929` - Maximum capability
- `claude-3-5-haiku-20241022` - Fast and cost-effective

Configure request timeouts:

```python
import httpx

client = Anthropic(
    timeout=httpx.Timeout(60.0)
)
```

Track token usage from the response:

```python
message = client.messages.create(...)
print(f"Input tokens: {message.usage.input_tokens}")
print(f"Output tokens: {message.usage.output_tokens}")
```
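The usage counts make per-request cost tracking straightforward. A minimal sketch of cost estimation; the per-million-token rates below are placeholders, so substitute the current pricing for whichever model you use:

```python
# Placeholder per-million-token rates in USD; replace with current pricing.
INPUT_RATE_PER_MTOK = 3.00
OUTPUT_RATE_PER_MTOK = 15.00

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate request cost in USD from the response's usage counts."""
    return (input_tokens * INPUT_RATE_PER_MTOK
            + output_tokens * OUTPUT_RATE_PER_MTOK) / 1_000_000

cost = estimate_cost(1024, 256)
print(f"${cost:.6f}")  # $0.006912
```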