- Spec: pypi-anthropic (docs/messages.md)
- Describes: pkg:pypi/anthropic@0.66.x
- Description: The official Python library for the Anthropic API
- Author: tessl
# Messages API

The Messages API is the primary interface for conversational interactions with Claude models. It supports multi-turn conversations, system prompts, tool use, streaming responses, and message batching with comprehensive type safety.

## Capabilities

### Message Creation

Create conversational messages with Claude models, supporting various content types, system prompts, and tool integration.

```python { .api }
def create(
    max_tokens: int,
    messages: List[MessageParam],
    model: str,
    *,
    metadata: Optional[MetadataParam] = None,
    service_tier: Optional[Literal["auto", "standard_only"]] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    system: Optional[str] = None,
    temperature: Optional[float] = None,
    thinking: Optional[ThinkingConfigParam] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Message

async def create(
    max_tokens: int,
    messages: List[MessageParam],
    model: str,
    *,
    metadata: Optional[MetadataParam] = None,
    service_tier: Optional[Literal["auto", "standard_only"]] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    system: Optional[str] = None,
    temperature: Optional[float] = None,
    thinking: Optional[ThinkingConfigParam] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Message
```

### Token Counting

Count tokens in messages before sending to estimate costs and ensure messages fit within model limits.

```python { .api }
def count_tokens(
    messages: List[MessageParam],
    model: str,
    *,
    system: Optional[str] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    **kwargs
) -> MessageTokensCount

async def count_tokens(
    messages: List[MessageParam],
    model: str,
    *,
    system: Optional[str] = None,
    tool_choice: Optional[ToolChoiceParam] = None,
    tools: Optional[List[ToolParam]] = None,
    **kwargs
) -> MessageTokensCount
```

## Core Types

### Message Types

```python { .api }
class Message(TypedDict):
    id: str
    type: Literal["message"]
    role: Literal["assistant"]
    content: List[ContentBlock]
    model: str
    stop_reason: Optional[StopReason]
    stop_sequence: Optional[str]
    usage: Usage

class MessageParam(TypedDict):
    role: Literal["user", "assistant"]
    content: Union[str, List[ContentBlockParam]]

class MessageTokensCount(TypedDict):
    input_tokens: int
    cache_creation_input_tokens: Optional[int]
    cache_read_input_tokens: Optional[int]
```

### Content Block Types

```python { .api }
class ContentBlock(TypedDict):
    type: str

class TextBlock(ContentBlock):
    type: Literal["text"]
    text: str

class ToolUseBlock(ContentBlock):
    type: Literal["tool_use"]
    id: str
    name: str
    input: Dict[str, Any]

class ContentBlockParam(TypedDict):
    type: str

class TextBlockParam(ContentBlockParam):
    type: Literal["text"]
    text: str
    cache_control: Optional[CacheControlEphemeralParam]

class ImageBlockParam(ContentBlockParam):
    type: Literal["image"]
    source: Union[Base64ImageSourceParam, URLImageSourceParam]
    cache_control: Optional[CacheControlEphemeralParam]

class DocumentBlockParam(ContentBlockParam):
    type: Literal["document"]
    source: Union[Base64PDFSourceParam, URLPDFSourceParam]
    cache_control: Optional[CacheControlEphemeralParam]

class ToolUseBlockParam(ContentBlockParam):
    type: Literal["tool_use"]
    id: str
    name: str
    input: Dict[str, Any]
    cache_control: Optional[CacheControlEphemeralParam]

class ToolResultBlockParam(ContentBlockParam):
    type: Literal["tool_result"]
    tool_use_id: str
    content: Union[str, List[ContentBlockParam]]
    is_error: Optional[bool]
    cache_control: Optional[CacheControlEphemeralParam]
```

### Image and Document Sources

```python { .api }
class Base64ImageSourceParam(TypedDict):
    type: Literal["base64"]
    media_type: Literal["image/jpeg", "image/png", "image/gif", "image/webp"]
    data: str

class URLImageSourceParam(TypedDict):
    type: Literal["url"]
    url: str

class Base64PDFSourceParam(TypedDict):
    type: Literal["base64"]
    media_type: Literal["application/pdf"]
    data: str

class URLPDFSourceParam(TypedDict):
    type: Literal["url"]
    url: str

class PlainTextSourceParam(TypedDict):
    type: Literal["text"]
    text: str
```

### Usage and Metadata

```python { .api }
class Usage(TypedDict):
    input_tokens: int
    output_tokens: int
    cache_creation_input_tokens: Optional[int]
    cache_read_input_tokens: Optional[int]

class MetadataParam(TypedDict, total=False):
    user_id: Optional[str]

class CacheControlEphemeralParam(TypedDict):
    type: Literal["ephemeral"]

# stop_reason is a plain string value, not an object
StopReason = Literal["end_turn", "max_tokens", "stop_sequence", "tool_use"]
```

### Extended Thinking Configuration

```python { .api }
class ThinkingConfigParam(TypedDict, total=False):
    type: Literal["enabled", "disabled"]
    budget_tokens: Optional[int]  # Required when type="enabled"; must be >= 1024 and < max_tokens

class ThinkingConfigEnabledParam(TypedDict):
    type: Literal["enabled"]
    budget_tokens: int  # How many tokens Claude may use for internal reasoning

class ThinkingConfigDisabledParam(TypedDict):
    type: Literal["disabled"]
```

## Usage Examples

### Basic Text Message

```python
from anthropic import Anthropic

client = Anthropic()

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)

print(message.content[0].text)
```

### Multi-Turn Conversation

```python
messages = [
    {"role": "user", "content": "Hello, can you help me with Python?"},
    {"role": "assistant", "content": "Of course! I'd be happy to help you with Python. What specific topic or problem would you like assistance with?"},
    {"role": "user", "content": "How do I read a CSV file?"}
]

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=messages
)
```

### System Prompt

```python
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    system="You are a helpful coding assistant. Always provide code examples when relevant.",
    messages=[
        {"role": "user", "content": "How do I sort a list in Python?"}
    ]
)
```

### Image Input

```python
import base64

# Read and base64-encode the image
with open("image.jpg", "rb") as img_file:
    img_data = base64.b64encode(img_file.read()).decode()

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "image",
                    "source": {
                        "type": "base64",
                        "media_type": "image/jpeg",
                        "data": img_data
                    }
                },
                {
                    "type": "text",
                    "text": "What do you see in this image?"
                }
            ]
        }
    ]
)
```

### PDF Document Input

```python
import base64

# Read and base64-encode the PDF
with open("document.pdf", "rb") as pdf_file:
    pdf_data = base64.b64encode(pdf_file.read()).decode()

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "document",
                    "source": {
                        "type": "base64",
                        "media_type": "application/pdf",
                        "data": pdf_data
                    }
                },
                {
                    "type": "text",
                    "text": "Summarize this document"
                }
            ]
        }
    ]
)
```

### Token Counting Example

```python
# Count tokens before sending
token_count = client.messages.count_tokens(
    model="claude-sonnet-4-20250514",
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ]
)

print(f"Input tokens: {token_count.input_tokens}")

# Only send the request if it fits the input budget we have set aside
if token_count.input_tokens < 4000:
    message = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[
            {"role": "user", "content": "What is the capital of France?"}
        ]
    )
```

### Streaming Messages

```python
with client.messages.stream(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Write a short story"}
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```

### Async Usage

```python
import asyncio
from anthropic import AsyncAnthropic

async def chat():
    client = AsyncAnthropic()

    message = await client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[
            {"role": "user", "content": "Hello!"}
        ]
    )

    return message.content[0].text

result = asyncio.run(chat())
```

### Extended Thinking Example

```python
# Enable extended thinking for complex analysis
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=4000,
    thinking={
        "type": "enabled",
        "budget_tokens": 2000  # Allow Claude 2000 tokens for internal reasoning
    },
    messages=[
        {"role": "user", "content": "Analyze this complex business problem and provide a detailed solution..."}
    ]
)
```

### Service Tier Example

```python
# Use priority capacity when available
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    service_tier="auto",  # Use priority capacity if available; fall back to standard
    messages=[
        {"role": "user", "content": "Urgent request requiring priority processing"}
    ]
)
```