- Spec: pypi-anthropic (docs/completions.md)
- Describes: pkg:pypi/anthropic@0.66.x
- Description: The official Python library for the Anthropic API
- Author: tessl
# Text Completions API

The Text Completions API provides direct text completion functionality using Claude models with a prompt-based approach. It is primarily used for workloads that require the legacy completion format, or for prompts that don't fit the conversational message format.

## Capabilities

### Completion Creation

Generate text completions from prompts with configurable parameters for controlling output generation.

```python { .api }
def create(
    max_tokens_to_sample: int,
    model: str,
    prompt: str,
    *,
    metadata: Optional[MetadataParam] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    temperature: Optional[float] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Completion

async def create(
    max_tokens_to_sample: int,
    model: str,
    prompt: str,
    *,
    metadata: Optional[MetadataParam] = None,
    stop_sequences: Optional[List[str]] = None,
    stream: Optional[bool] = None,
    temperature: Optional[float] = None,
    top_k: Optional[int] = None,
    top_p: Optional[float] = None,
    **kwargs
) -> Completion
```

## Core Types

### Completion Types

```python { .api }
class Completion(TypedDict):
    id: str
    type: Literal["completion"]
    completion: str
    stop_reason: Optional[StopReason]
    model: str

class CompletionCreateParams(TypedDict):
    max_tokens_to_sample: int
    model: str
    prompt: str
    metadata: Optional[MetadataParam]
    stop_sequences: Optional[List[str]]
    stream: Optional[bool]
    temperature: Optional[float]
    top_k: Optional[int]
    top_p: Optional[float]

class StopReason(TypedDict):
    type: Literal["stop_sequence", "max_tokens"]
```

### Parameter Types

```python { .api }
class MetadataParam(TypedDict, total=False):
    user_id: Optional[str]
```

## Usage Examples

Note: the legacy API expects the prompt to begin with a `\n\nHuman:` turn and end with a `\n\nAssistant:` turn (optionally followed by partial assistant text to continue); the examples below follow that format.

### Basic Text Completion

```python
from anthropic import Anthropic

client = Anthropic()

completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: What is the capital of France?\n\nAssistant:",
    max_tokens_to_sample=100
)

print(completion.completion)
# Output: " The capital of France is Paris."
```

### Completion with Stop Sequences

```python
completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: List three fruits.\n\nAssistant: 1.",
    max_tokens_to_sample=50,
    stop_sequences=["\n4."]
)

print(completion.completion)
# Output: " Apple\n2. Banana\n3. Orange"
```

### Temperature Control

```python
# Lower temperature for more focused, deterministic output
focused_completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: What is the scientific name for water?\n\nAssistant:",
    max_tokens_to_sample=20,
    temperature=0.1
)

# Higher temperature for more creative, varied output
creative_completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: Write a creative opening line for a story.\n\nAssistant:",
    max_tokens_to_sample=50,
    temperature=0.9
)
```

### Top-k and Top-p Sampling

```python
# Top-k sampling: limit each step to the 10 most likely tokens
completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: Describe today's weather in one sentence.\n\nAssistant: The weather today is",
    max_tokens_to_sample=30,
    top_k=10
)

# Top-p (nucleus) sampling: limit to tokens comprising the top 90% probability mass
completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: Describe today's weather in one sentence.\n\nAssistant: The weather today is",
    max_tokens_to_sample=30,
    top_p=0.9
)
```

### Streaming Completions

```python
stream = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: Write a short poem about mountains.\n\nAssistant:",
    max_tokens_to_sample=200,
    stream=True
)

for completion in stream:
    print(completion.completion, end="", flush=True)
```

### Multiple Stop Sequences

```python
completion = client.completions.create(
    model="claude-2.1",
    prompt="\n\nHuman: What is 2+2? Answer with just the number.\n\nAssistant:",
    max_tokens_to_sample=100,
    stop_sequences=["\n", "Q:", "Human:"]
)

print(completion.completion.strip())
# Output: "4"
```

### Legacy Prompt Format

```python
# Using the Human/Assistant turn markers exported by the SDK
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

client = Anthropic()

prompt = f"{HUMAN_PROMPT} Can you explain photosynthesis in simple terms?{AI_PROMPT}"

completion = client.completions.create(
    model="claude-2.1",
    prompt=prompt,
    max_tokens_to_sample=200
)

print(completion.completion)
```

### Async Completions

```python
import asyncio
from anthropic import AsyncAnthropic

async def async_completion_example():
    client = AsyncAnthropic()

    completion = await client.completions.create(
        model="claude-2.1",
        prompt="\n\nHuman: Complete this sentence: the future of artificial intelligence is...\n\nAssistant:",
        max_tokens_to_sample=100,
        temperature=0.7
    )

    return completion.completion

result = asyncio.run(async_completion_example())
print(result)
```

### Error Handling with Completions

```python
from anthropic import Anthropic, RateLimitError, APITimeoutError

client = Anthropic()

try:
    completion = client.completions.create(
        model="claude-2.1",
        prompt="\n\nHuman: Write a haiku about programming.\n\nAssistant:",
        max_tokens_to_sample=50
    )

    print(completion.completion)

except RateLimitError as e:
    print(f"Rate limited: {e}")

except APITimeoutError as e:
    print(f"Request timed out: {e}")

except Exception as e:
    print(f"Unexpected error: {e}")
```
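
### Building Legacy Prompts Programmatically

A multi-turn conversation can be flattened into a single legacy-format prompt string. The sketch below is illustrative (`build_legacy_prompt` is not part of the SDK; the turn-marker values shown match the `HUMAN_PROMPT` and `AI_PROMPT` constants the `anthropic` package exports):

```python
# Turn markers used by the legacy Completions API; the SDK exports the
# same values as anthropic.HUMAN_PROMPT and anthropic.AI_PROMPT.
HUMAN_PROMPT = "\n\nHuman:"
AI_PROMPT = "\n\nAssistant:"

def build_legacy_prompt(turns):
    """Flatten (role, text) pairs into a legacy completion prompt.

    Turns should alternate starting with "human"; the result ends with
    the Assistant marker so the model continues from there.
    """
    parts = []
    for role, text in turns:
        marker = HUMAN_PROMPT if role == "human" else AI_PROMPT
        parts.append(f"{marker} {text}")
    return "".join(parts) + AI_PROMPT

prompt = build_legacy_prompt([
    ("human", "What is the capital of France?"),
    ("assistant", "Paris."),
    ("human", "And of Germany?"),
])
# prompt == "\n\nHuman: What is the capital of France?"
#           "\n\nAssistant: Paris.\n\nHuman: And of Germany?\n\nAssistant:"
```

The resulting string can be passed directly as the `prompt` argument to `client.completions.create(...)`.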
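
### Retrying After Rate Limits

A `RateLimitError` is usually transient, so wrapping the request in an exponential-backoff retry is a common pattern. The `with_backoff` helper below is an illustrative sketch, not part of the SDK; in real use the retried call would be a `client.completions.create(...)` lambda and `retryable` would be `(anthropic.RateLimitError,)`:

```python
import time

def with_backoff(fn, retryable=(Exception,), max_retries=3, base_delay=1.0):
    """Call fn(), retrying on the given exceptions with exponential backoff."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_retries:
                raise  # out of retries; surface the last error
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# In real use (assuming a configured Anthropic client):
# completion = with_backoff(
#     lambda: client.completions.create(
#         model="claude-2.1",
#         prompt="\n\nHuman: Write a haiku about programming.\n\nAssistant:",
#         max_tokens_to_sample=50,
#     ),
#     retryable=(RateLimitError,),
# )
```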