pypi-pydantic-ai

Description: Agent Framework / shim to use Pydantic with LLMs
Author: tessl

How to use:

    npx @tessl/cli registry install tessl/pypi-pydantic-ai@0.8.0

docs/models.md

# Model Integration

Comprehensive model abstraction supporting 10+ LLM providers including OpenAI, Anthropic, Google, Groq, Cohere, Mistral, and more. Provides a unified interface with provider-specific optimizations and fallback capabilities.

## Capabilities

### OpenAI Models

Integration with OpenAI's GPT models including GPT-4, GPT-3.5-turbo, and other OpenAI models.

```python { .api }
class OpenAIModel:
    """
    OpenAI model integration supporting GPT-4, GPT-3.5-turbo, and other OpenAI models.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        base_url: str | None = None,
        openai_client: OpenAI | None = None,
        timeout: float | None = None
    ):
        """
        Initialize OpenAI model.

        Parameters:
        - model_name: OpenAI model name (e.g., 'gpt-4', 'gpt-3.5-turbo')
        - api_key: OpenAI API key (defaults to the OPENAI_API_KEY env var)
        - base_url: Custom base URL for the OpenAI API
        - openai_client: Pre-configured OpenAI client instance
        - timeout: Request timeout in seconds
        """
```

### Anthropic Models

Integration with Anthropic's Claude models including Claude-3.5, Claude-3, and other Anthropic models.

```python { .api }
class AnthropicModel:
    """
    Anthropic model integration supporting Claude-3.5, Claude-3, and other Anthropic models.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        base_url: str | None = None,
        anthropic_client: Anthropic | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Anthropic model.

        Parameters:
        - model_name: Anthropic model name (e.g., 'claude-3-5-sonnet-20241022')
        - api_key: Anthropic API key (defaults to the ANTHROPIC_API_KEY env var)
        - base_url: Custom base URL for the Anthropic API
        - anthropic_client: Pre-configured Anthropic client instance
        - timeout: Request timeout in seconds
        """
```

### Google Models

Integration with Google's Gemini and other Google AI models.

```python { .api }
class GeminiModel:
    """
    Google Gemini model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Gemini model.

        Parameters:
        - model_name: Gemini model name (e.g., 'gemini-1.5-pro')
        - api_key: Google API key (defaults to the GOOGLE_API_KEY env var)
        - timeout: Request timeout in seconds
        """

class GoogleModel:
    """
    Google AI model integration for Vertex AI and other Google models.
    """
    def __init__(
        self,
        model_name: str,
        *,
        project_id: str | None = None,
        location: str = 'us-central1',
        credentials: dict | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Google AI model.

        Parameters:
        - model_name: Google model name
        - project_id: Google Cloud project ID
        - location: Google Cloud region
        - credentials: Service account credentials
        - timeout: Request timeout in seconds
        """
```

### Other Model Providers

Support for additional LLM providers with a consistent interface.

```python { .api }
class GroqModel:
    """
    Groq model integration for fast inference.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ): ...

class CohereModel:
    """
    Cohere model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ): ...

class MistralModel:
    """
    Mistral AI model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        timeout: float | None = None
    ): ...

class HuggingFaceModel:
    """
    HuggingFace model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        api_key: str | None = None,
        base_url: str | None = None,
        timeout: float | None = None
    ): ...
```

### AWS Bedrock Integration

Integration with AWS Bedrock for accessing various models through AWS infrastructure.

```python { .api }
class BedrockModel:
    """
    AWS Bedrock model integration.
    """
    def __init__(
        self,
        model_name: str,
        *,
        region: str | None = None,
        aws_access_key_id: str | None = None,
        aws_secret_access_key: str | None = None,
        aws_session_token: str | None = None,
        profile: str | None = None,
        timeout: float | None = None
    ):
        """
        Initialize Bedrock model.

        Parameters:
        - model_name: Bedrock model ID
        - region: AWS region
        - aws_access_key_id: AWS access key
        - aws_secret_access_key: AWS secret key
        - aws_session_token: AWS session token
        - profile: AWS profile name
        - timeout: Request timeout in seconds
        """
```

### Model Abstractions

Core model interface and utilities for working with models.

```python { .api }
class Model:
    """
    Abstract model interface that all model implementations must follow.
    """
    def name(self) -> str: ...

    async def request(
        self,
        messages: list[ModelMessage],
        model_settings: ModelSettings | None = None
    ) -> ModelResponse: ...

    async def request_stream(
        self,
        messages: list[ModelMessage],
        model_settings: ModelSettings | None = None
    ) -> StreamedResponse: ...

class StreamedResponse:
    """
    Streamed model response for real-time processing.
    """
    def __aiter__(self) -> AsyncIterator[ModelResponseStreamEvent]: ...

    async def get_final_response(self) -> ModelResponse: ...

class InstrumentedModel:
    """
    Model wrapper with OpenTelemetry instrumentation.
    """
    def __init__(
        self,
        model: Model,
        settings: InstrumentationSettings
    ): ...
```
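
The `StreamedResponse` interface implies a consume-then-finalize pattern: iterate events as they arrive, then fetch the assembled response. A minimal, library-independent sketch of that pattern (all names here are stand-ins, not pydantic-ai's classes):

```python
import asyncio
from typing import AsyncIterator


class FakeStream:
    """Stands in for StreamedResponse: iterate events, then get the final text."""

    def __init__(self, chunks: list[str]):
        self._chunks = chunks
        self._seen: list[str] = []

    def __aiter__(self) -> AsyncIterator[str]:
        return self._events()

    async def _events(self) -> AsyncIterator[str]:
        for chunk in self._chunks:
            self._seen.append(chunk)
            yield chunk

    async def get_final_response(self) -> str:
        # The final response is assembled from everything streamed so far.
        return "".join(self._seen)


async def main() -> str:
    stream = FakeStream(["Hello", ", ", "world"])
    async for event in stream:
        pass  # a real caller would render each event incrementally
    return await stream.get_final_response()


print(asyncio.run(main()))  # Hello, world
```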

### Model Utilities

Helper functions for working with models and model names.

```python { .api }
def infer_model(model: Model | KnownModelName) -> Model:
    """
    Infer model instance from string name or return existing model.

    Parameters:
    - model: Model instance or known model name string

    Returns:
    Model instance ready for use
    """

def instrument_model(
    model: Model,
    settings: InstrumentationSettings | None = None
) -> InstrumentedModel:
    """
    Add OpenTelemetry instrumentation to a model.

    Parameters:
    - model: Model to instrument
    - settings: Instrumentation configuration

    Returns:
    Instrumented model wrapper
    """
```
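
`infer_model` accepts either an existing `Model` or a known name string. The dispatch logic can be sketched independently of the library — the class and registry below are illustrative stand-ins, not pydantic-ai's implementation:

```python
from dataclasses import dataclass


@dataclass
class StubModel:
    """Stand-in for a concrete Model implementation."""
    model_name: str

    def name(self) -> str:
        return self.model_name


# Illustrative registry: known name prefixes mapped to a constructor.
_PREFIXES = {
    "gpt-": StubModel,
    "o1-": StubModel,
    "claude-": StubModel,
    "gemini-": StubModel,
}


def infer_model(model: "StubModel | str") -> StubModel:
    """Return the model unchanged, or build one from a known name string."""
    if isinstance(model, StubModel):
        return model
    for prefix, cls in _PREFIXES.items():
        if model.startswith(prefix):
            return cls(model)
    raise ValueError(f"Unknown model name: {model!r}")


print(infer_model("gpt-4").name())              # gpt-4
print(infer_model(StubModel("custom")).name())  # custom
```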

### Fallback Models

Model that automatically falls back to alternative models on failure.

```python { .api }
class FallbackModel:
    """
    Model that falls back to alternative models on failure.
    """
    def __init__(
        self,
        models: list[Model],
        *,
        max_retries: int = 3
    ):
        """
        Initialize fallback model.

        Parameters:
        - models: List of models to try in order
        - max_retries: Maximum retry attempts per model
        """
```
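
The fallback behavior described above — try each model in order, with bounded retries per model — can be sketched independently of the library (all names here are illustrative, and models are simplified to plain callables):

```python
from typing import Callable


def run_with_fallback(
    models: list[Callable[[str], str]],
    prompt: str,
    *,
    max_retries: int = 3,
) -> str:
    """Try each model in order; retry each up to max_retries times before moving on."""
    last_error: Exception | None = None
    for model in models:
        for _ in range(max_retries):
            try:
                return model(prompt)
            except Exception as exc:  # a real implementation would narrow this
                last_error = exc
    raise RuntimeError("All models failed") from last_error


# A flaky primary and a reliable backup:
def flaky(prompt: str) -> str:
    raise ConnectionError("primary unavailable")


def backup(prompt: str) -> str:
    return f"backup answered: {prompt}"


print(run_with_fallback([flaky, backup], "hi"))  # backup answered: hi
```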

### Test Models

Models designed for testing and development.

```python { .api }
class TestModel:
    """
    Test model implementation for testing and development.
    """
    def __init__(
        self,
        *,
        custom_result_text: str | None = None,
        custom_result_tool_calls: list[ToolCallPart] | None = None,
        custom_result_structured: Any = None
    ):
        """
        Initialize test model with predefined responses.

        Parameters:
        - custom_result_text: Fixed text response
        - custom_result_tool_calls: Fixed tool calls to make
        - custom_result_structured: Fixed structured response
        """

class FunctionModel:
    """
    Function-based model for custom logic during testing.
    """
    def __init__(
        self,
        function: Callable[[list[ModelMessage]], ModelResponse | str],
        *,
        stream_function: Callable | None = None
    ):
        """
        Initialize function model.

        Parameters:
        - function: Function that processes messages and returns response
        - stream_function: Optional function for streaming responses
        """
```
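
`FunctionModel` routes every request through a plain callable, which keeps test behavior fully deterministic. A library-independent sketch of the idea, with message types simplified to strings (nothing here is pydantic-ai's actual code):

```python
from typing import Callable


def make_function_model(
    function: Callable[[list[str]], str],
) -> Callable[[list[str]], str]:
    """Wrap a callable so it can stand in for a model: it receives the full
    message history and returns the model's reply."""
    def request(messages: list[str]) -> str:
        return function(messages)
    return request


def canned_logic(messages: list[str]) -> str:
    """Deterministic replies keyed on the last message."""
    if "weather" in messages[-1].lower():
        return "It is sunny."
    return f"You said: {messages[-1]}"


model = make_function_model(canned_logic)
print(model(["What is the weather?"]))  # It is sunny.
print(model(["Hello"]))                 # You said: Hello
```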

### Known Model Names

Type alias for all supported model name strings.

```python { .api }
KnownModelName = Literal[
    # OpenAI models
    'gpt-4o',
    'gpt-4o-mini',
    'gpt-4-turbo',
    'gpt-4',
    'gpt-3.5-turbo',
    'o1-preview',
    'o1-mini',

    # Anthropic models
    'claude-3-5-sonnet-20241022',
    'claude-3-5-haiku-20241022',
    'claude-3-opus-20240229',
    'claude-3-sonnet-20240229',
    'claude-3-haiku-20240307',

    # Google models
    'gemini-1.5-pro',
    'gemini-1.5-flash',
    'gemini-1.0-pro',

    # And many more...
]
```

## Usage Examples

### Basic Model Usage

```python
from pydantic_ai import Agent
from pydantic_ai.models import OpenAIModel, AnthropicModel

# Using OpenAI
openai_agent = Agent(
    model=OpenAIModel('gpt-4'),
    system_prompt='You are a helpful assistant.'
)

# Using Anthropic
anthropic_agent = Agent(
    model=AnthropicModel('claude-3-5-sonnet-20241022'),
    system_prompt='You are a helpful assistant.'
)

# Using a model name directly (auto-inferred)
agent = Agent(
    model='gpt-4',
    system_prompt='You are a helpful assistant.'
)
```

### Model with Custom Configuration

```python
from pydantic_ai import Agent
from pydantic_ai.models import OpenAIModel

# Custom OpenAI configuration
model = OpenAIModel(
    'gpt-4',
    api_key='your-api-key',
    base_url='https://custom-endpoint.com/v1',
    timeout=30.0
)

agent = Agent(model=model, system_prompt='Custom configured agent.')
```

### Fallback Model Configuration

```python
from pydantic_ai import Agent
from pydantic_ai.models import FallbackModel, OpenAIModel, AnthropicModel

# Create a fallback model that tries OpenAI first, then Anthropic
fallback_model = FallbackModel([
    OpenAIModel('gpt-4'),
    AnthropicModel('claude-3-5-sonnet-20241022')
])

agent = Agent(
    model=fallback_model,
    system_prompt='Resilient agent with fallback.'
)
```

### Testing with Test Models

```python
from pydantic_ai import Agent
from pydantic_ai.models import TestModel

# Test model with a fixed response
test_model = TestModel(custom_result_text='Fixed test response')

agent = Agent(model=test_model, system_prompt='Test agent.')
result = agent.run_sync('Any input')
print(result.data)  # "Fixed test response"
```