pypi-pydantic-ai — spec file

Describes: pkg:pypi/pydantic-ai@0.8.x

- Description: Agent framework / shim to use Pydantic with LLMs
- Author: tessl
- File: docs/models.md
# Model Integration

Comprehensive model abstraction supporting 10+ LLM providers including OpenAI, Anthropic, Google, Groq, Cohere, Mistral, and more. Provides a unified interface with provider-specific optimizations and fallback capabilities.

## Capabilities

### OpenAI Models

Integration with OpenAI's GPT models, including GPT-4, GPT-3.5-turbo, and other OpenAI models.

```python { .api }
class OpenAIModel:
    """
    OpenAI model integration supporting GPT-4, GPT-3.5-turbo, and other OpenAI models.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        base_url: str | None = None,
        openai_client: OpenAI | None = None,
        timeout: float | None = None
    ):
        """
        Initialize OpenAI model.

        Parameters:
        - model_name: OpenAI model name (e.g., 'gpt-4', 'gpt-3.5-turbo')
        - api_key: OpenAI API key (defaults to OPENAI_API_KEY env var)
        - base_url: Custom base URL for the OpenAI API
        - openai_client: Pre-configured OpenAI client instance
        - timeout: Request timeout in seconds
        """
```

### Anthropic Models

Integration with Anthropic's Claude models, including Claude 3.5, Claude 3, and other Anthropic models.

```python { .api }
class AnthropicModel:
    """
    Anthropic model integration supporting Claude 3.5, Claude 3, and other Anthropic models.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        base_url: str | None = None,
        anthropic_client: Anthropic | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Anthropic model.

        Parameters:
        - model_name: Anthropic model name (e.g., 'claude-3-5-sonnet-20241022')
        - api_key: Anthropic API key (defaults to ANTHROPIC_API_KEY env var)
        - base_url: Custom base URL for the Anthropic API
        - anthropic_client: Pre-configured Anthropic client instance
        - timeout: Request timeout in seconds
        """
```

### Google Models

Integration with Google's Gemini and other Google AI models.

```python { .api }
class GeminiModel:
    """
    Google Gemini model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Gemini model.

        Parameters:
        - model_name: Gemini model name (e.g., 'gemini-1.5-pro')
        - api_key: Google API key (defaults to GOOGLE_API_KEY env var)
        - timeout: Request timeout in seconds
        """

class GoogleModel:
    """
    Google AI model integration for Vertex AI and other Google models.
    """
    def __init__(
        self,
        model_name: str,
        *,
        project_id: str | None = None,
        location: str = 'us-central1',
        credentials: dict | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Google AI model.

        Parameters:
        - model_name: Google model name
        - project_id: Google Cloud project ID
        - location: Google Cloud region
        - credentials: Service account credentials
        - timeout: Request timeout in seconds
        """
```

### Other Model Providers

Support for additional LLM providers with a consistent interface.

```python { .api }
class GroqModel:
    """
    Groq model integration for fast inference.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ): ...

class CohereModel:
    """
    Cohere model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ): ...

class MistralModel:
    """
    Mistral AI model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ): ...

class HuggingFaceModel:
    """
    HuggingFace model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        base_url: str | None = None,
        timeout: float | None = None
    ): ...
```

### AWS Bedrock Integration

Integration with AWS Bedrock for accessing various models through AWS infrastructure.

```python { .api }
class BedrockModel:
    """
    AWS Bedrock model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        region: str | None = None,
        aws_access_key_id: str | None = None,
        aws_secret_access_key: str | None = None,
        aws_session_token: str | None = None,
        profile: str | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Bedrock model.

        Parameters:
        - model_name: Bedrock model ID
        - region: AWS region
        - aws_access_key_id: AWS access key
        - aws_secret_access_key: AWS secret key
        - aws_session_token: AWS session token
        - profile: AWS profile name
        - timeout: Request timeout in seconds
        """
```

### Model Abstractions

Core model interface and utilities for working with models.

```python { .api }
class Model:
    """
    Abstract model interface that all model implementations must follow.
    """
    def name(self) -> str: ...

    async def request(
        self,
        messages: list[ModelMessage],
        model_settings: ModelSettings | None = None
    ) -> ModelResponse: ...

    async def request_stream(
        self,
        messages: list[ModelMessage],
        model_settings: ModelSettings | None = None
    ) -> StreamedResponse: ...

class StreamedResponse:
    """
    Streamed model response for real-time processing.
    """
    def __aiter__(self) -> AsyncIterator[ModelResponseStreamEvent]: ...

    async def get_final_response(self) -> ModelResponse: ...

class InstrumentedModel:
    """
    Model wrapper with OpenTelemetry instrumentation.
    """
    def __init__(
        self,
        model: Model,
        settings: InstrumentationSettings
    ): ...
```

### Model Utilities

Helper functions for working with models and model names.

```python { .api }
def infer_model(model: Model | KnownModelName) -> Model:
    """
    Infer a model instance from a string name, or return the existing model unchanged.

    Parameters:
    - model: Model instance or known model name string

    Returns:
    Model instance ready for use
    """

def instrument_model(
    model: Model,
    settings: InstrumentationSettings | None = None
) -> InstrumentedModel:
    """
    Add OpenTelemetry instrumentation to a model.

    Parameters:
    - model: Model to instrument
    - settings: Instrumentation configuration

    Returns:
    Instrumented model wrapper
    """
```

### Fallback Models

A model that automatically falls back to alternative models on failure.

```python { .api }
class FallbackModel:
    """
    Model that falls back to alternative models on failure.
    """
    def __init__(
        self,
        models: list[Model],
        *,
        max_retries: int = 3
    ):
        """
        Initialize fallback model.

        Parameters:
        - models: List of models to try in order
        - max_retries: Maximum retry attempts per model
        """
```

### Test Models

Models designed for testing and development.

```python { .api }
class TestModel:
    """
    Test model implementation for testing and development.
    """
    def __init__(
        self,
        *,
        custom_result_text: str | None = None,
        custom_result_tool_calls: list[ToolCallPart] | None = None,
        custom_result_structured: Any = None
    ):
        """
        Initialize test model with predefined responses.

        Parameters:
        - custom_result_text: Fixed text response
        - custom_result_tool_calls: Fixed tool calls to make
        - custom_result_structured: Fixed structured response
        """

class FunctionModel:
    """
    Function-based model for custom logic during testing.
    """
    def __init__(
        self,
        function: Callable[[list[ModelMessage]], ModelResponse | str],
        *,
        stream_function: Callable | None = None
    ):
        """
        Initialize function model.

        Parameters:
        - function: Function that processes messages and returns a response
        - stream_function: Optional function for streaming responses
        """
```

### Known Model Names

Type alias for all supported model name strings.

```python { .api }
KnownModelName = Literal[
    # OpenAI models
    'gpt-4o',
    'gpt-4o-mini',
    'gpt-4-turbo',
    'gpt-4',
    'gpt-3.5-turbo',
    'o1-preview',
    'o1-mini',

    # Anthropic models
    'claude-3-5-sonnet-20241022',
    'claude-3-5-haiku-20241022',
    'claude-3-opus-20240229',
    'claude-3-sonnet-20240229',
    'claude-3-haiku-20240307',

    # Google models
    'gemini-1.5-pro',
    'gemini-1.5-flash',
    'gemini-1.0-pro',

    # And many more...
]
```

## Usage Examples

### Basic Model Usage

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.models.anthropic import AnthropicModel

# Using OpenAI
openai_agent = Agent(
    model=OpenAIModel('gpt-4'),
    system_prompt='You are a helpful assistant.'
)

# Using Anthropic
anthropic_agent = Agent(
    model=AnthropicModel('claude-3-5-sonnet-20241022'),
    system_prompt='You are a helpful assistant.'
)

# Using a model name directly (auto-inferred via infer_model)
agent = Agent(
    model='gpt-4',
    system_prompt='You are a helpful assistant.'
)
```

### Model with Custom Configuration

```python
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

# Custom OpenAI configuration
model = OpenAIModel(
    'gpt-4',
    api_key='your-api-key',
    base_url='https://custom-endpoint.com/v1',
    timeout=30.0
)

agent = Agent(model=model, system_prompt='Custom configured agent.')
```

### Fallback Model Configuration

```python
from pydantic_ai import Agent
from pydantic_ai.models.fallback import FallbackModel
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.models.anthropic import AnthropicModel

# Create a fallback model that tries OpenAI first, then Anthropic
fallback_model = FallbackModel([
    OpenAIModel('gpt-4'),
    AnthropicModel('claude-3-5-sonnet-20241022')
])

agent = Agent(
    model=fallback_model,
    system_prompt='Resilient agent with fallback.'
)
```

### Testing with Test Models

```python
from pydantic_ai import Agent
from pydantic_ai.models.test import TestModel

# Test model with a fixed response
test_model = TestModel(custom_result_text='Fixed test response')

agent = Agent(model=test_model, system_prompt='Test agent.')
result = agent.run_sync('Any input')
print(result.output)  # "Fixed test response"
```
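
### How Fallback Works

The fallback strategy described under Fallback Models can be sketched in plain Python. This is an illustrative stand-in, not pydantic-ai code: `SimpleModel` and `fallback_request` are hypothetical names, assuming each model in the list is retried up to `max_retries` times before the next model is tried.

```python
class SimpleModel:
    """Hypothetical stand-in for a Model implementation."""

    def __init__(self, name: str, fail_times: int = 0):
        self.name = name
        self._fail_times = fail_times  # number of requests that will fail

    def request(self, messages: list) -> str:
        if self._fail_times > 0:
            self._fail_times -= 1
            raise RuntimeError(f'{self.name} unavailable')
        return f'{self.name}: ok'


def fallback_request(models: list, messages: list, max_retries: int = 3) -> str:
    """Try each model in order, retrying up to max_retries times per model."""
    last_error: Exception | None = None
    for model in models:
        for _ in range(max_retries):
            try:
                return model.request(messages)
            except RuntimeError as exc:
                last_error = exc
    # every model exhausted its retry budget
    raise last_error


# The primary fails more times than the retry budget allows,
# so the secondary model ends up answering.
primary = SimpleModel('gpt-4', fail_times=5)
secondary = SimpleModel('claude-3-5-sonnet-20241022')
print(fallback_request([primary, secondary], []))  # claude-3-5-sonnet-20241022: ok
```

Note the design choice implied by the spec's `max_retries` parameter: retries are budgeted per model, so a transient error on the primary does not immediately route traffic to the fallback.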