CLI utility and Python library for interacting with Large Language Models from multiple providers including OpenAI, Anthropic, Google, and Meta plus locally installed models.
npx @tessl/cli install tessl/pypi-llm@0.27.00
# LLM

A comprehensive CLI utility and Python library for interacting with Large Language Models from multiple providers, including OpenAI, Anthropic's Claude, Google's Gemini, and Meta's Llama, supporting both remote APIs and locally installed models. The package provides extensive functionality for executing prompts, storing conversations, generating embeddings, extracting structured content, and enabling models to execute tools through an extensible plugin system.

## Package Information

- **Package Name**: llm
- **Language**: Python
- **Installation**: `pip install llm`
- **Documentation**: https://llm.datasette.io/
- **Repository**: https://github.com/simonw/llm

## Core Imports

```python
import llm
```

Common imports for library usage:

```python
from llm import get_model, get_async_model, Conversation, Response
from llm import AsyncModel, EmbeddingModel
```

For embedding operations:

```python
from llm import Collection, get_embedding_model
```

For tools and templates:

```python
from llm import Tool, Toolbox, Template
```

## Basic Usage

### Simple Model Interaction

```python
import llm

# Get a model (defaults to gpt-4o-mini)
model = llm.get_model()

# Send a prompt and get the response
response = model.prompt("What is the capital of France?")
print(response.text())
```

### Conversation with Tools

```python
import llm

# Define a simple tool
def get_weather(location: str) -> str:
    """Get weather information for a location."""
    return f"The weather in {location} is sunny and 75°F"

# Create the tool and model
weather_tool = llm.Tool.function(get_weather)
model = llm.get_model("gpt-4")

# Start a conversation with tool access; chain() executes any tool
# calls the model makes and continues until a final answer is ready
conversation = model.conversation(tools=[weather_tool])
response = conversation.chain("What's the weather like in Paris?")
print(response.text())
```

### Working with Embeddings

```python
import llm

# Get an embedding model and create a collection
embedding_model = llm.get_embedding_model("ada-002")
collection = llm.Collection("documents", model=embedding_model)

# Add documents (store=True keeps the text so entry.content is available)
collection.embed("doc1", "Paris is the capital of France", store=True)
collection.embed("doc2", "London is the capital of England", store=True)

# Find similar documents
results = collection.similar("French capital city")
for entry in results:
    print(f"{entry.id}: {entry.content} (score: {entry.score})")
```

## Architecture

The LLM package is built around several key architectural components:

### Model Hierarchy

- **Model Classes**: Abstract base classes for sync (Model, KeyModel) and async (AsyncModel, AsyncKeyModel) implementations
- **Discovery System**: Plugin-based model registration and alias management
- **Response System**: Streaming and non-streaming response handling with tool integration

### Conversation Management

- **Conversation Objects**: Stateful conversation containers that maintain history and context
- **Prompt System**: Rich prompt objects supporting attachments, tools, and structured schemas
- **Tool Integration**: Function calling with automatic schema generation and chaining

### Plugin Ecosystem

- **Hook System**: Pluggy-based hook specifications for extensibility
- **Model Registration**: Plugin-provided models and embedding models
- **Tool Registration**: Plugin-provided tools and toolboxes
- **Template/Fragment Loaders**: Plugin-provided content processors

### Data Storage

- **SQLite Backend**: Conversation history, embeddings, and metadata storage
- **Configuration Management**: User directory with keys, aliases, and settings
- **Migration System**: Database schema evolution support

This architecture enables the package to serve as both a standalone CLI tool and a comprehensive Python library while maintaining extensibility through its plugin system.

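
To make the SQLite-backed storage idea concrete, here is a toy conversation log using only the standard library. The table layout is illustrative, not the package's real schema (which is created and evolved by its migration system):

```python
import sqlite3

# Toy stand-in for a conversation log table (the real schema differs)
db = sqlite3.connect(":memory:")
db.execute(
    """CREATE TABLE responses (
        id INTEGER PRIMARY KEY,
        conversation_id TEXT,
        prompt TEXT,
        response TEXT
    )"""
)
db.execute(
    "INSERT INTO responses (conversation_id, prompt, response) VALUES (?, ?, ?)",
    ("conv-1", "What is the capital of France?", "Paris."),
)
row = db.execute(
    "SELECT response FROM responses WHERE conversation_id = ?", ("conv-1",)
).fetchone()
assert row[0] == "Paris."
```
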
## Capabilities

### Models and Conversations

Core model management, conversation handling, prompt processing, and response streaming. Supports both synchronous and asynchronous operations with comprehensive error handling.

```python { .api }
def get_model(name: Optional[str] = None) -> Model
def get_async_model(name: Optional[str] = None) -> AsyncModel
class Conversation:
    def prompt(self, prompt, **kwargs) -> Response
class Response:
    def text(self) -> str
    def __iter__(self) -> Iterator[str]
```

[Models and Conversations](./models-and-conversations.md)

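
The streaming contract above (iterate a response for chunks, or call `text()` for the full string) can be illustrated with a toy stand-in; this is not the package's actual `Response` class:

```python
# Toy stand-in for the streaming Response contract (illustrative only)
class FakeResponse:
    def __init__(self, chunks):
        self._pending = list(chunks)
        self._seen = []

    def __iter__(self):
        # Yield chunks as they "arrive", remembering what was consumed
        while self._pending:
            chunk = self._pending.pop(0)
            self._seen.append(chunk)
            yield chunk

    def text(self):
        # Drain any remaining chunks, then return the accumulated text
        for _ in self:
            pass
        return "".join(self._seen)

streamed = FakeResponse(["The capital ", "of France ", "is Paris."])
assert streamed.text() == "The capital of France is Paris."
```
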

### Tools and Toolboxes

Function calling system with automatic schema generation, tool chaining, and error handling. Supports both individual tools and organized toolbox collections.

```python { .api }
class Tool:
    @classmethod
    def function(cls, function, name=None, description=None) -> Tool
class Toolbox:
    def tools(self) -> Iterable[Tool]
class ToolCall:
    function: str
    arguments: dict
```

[Tools and Toolboxes](./tools-and-toolboxes.md)

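
Automatic schema generation typically means deriving a JSON-schema-style description from a function's signature and docstring. A simplified sketch of that idea, not the package's actual implementation:

```python
import inspect

# Map Python annotations to JSON schema types (simplified assumption)
_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}

def function_schema(fn):
    """Derive a minimal JSON-schema-style dict from a function signature."""
    sig = inspect.signature(fn)
    properties = {}
    required = []
    for name, param in sig.parameters.items():
        properties[name] = {"type": _TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

def get_weather(location: str, units: str = "F") -> str:
    """Get weather information for a location."""
    return f"Sunny in {location}"

schema = function_schema(get_weather)
assert schema["parameters"]["required"] == ["location"]
```
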

### Embeddings and Vector Operations

Vector database operations with similarity search, metadata storage, and efficient batch processing. Supports multiple embedding models and custom similarity metrics.

```python { .api }
class Collection:
    def embed(self, id: str, value: Union[str, bytes], metadata=None, store=False)
    def similar(self, value, number=10) -> List[Entry]
    def similar_by_id(self, id: str, number=10) -> List[Entry]
def cosine_similarity(a: List[float], b: List[float]) -> float
```

[Embeddings](./embeddings.md)

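
`cosine_similarity` is the standard dot-product-over-magnitudes formula. A plain-Python equivalent for reference (the package's version may differ in details):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(x * x for x in b))
    return dot / (mag_a * mag_b)

assert cosine_similarity([1.0, 0.0], [1.0, 0.0]) == 1.0   # identical direction
assert cosine_similarity([1.0, 0.0], [0.0, 1.0]) == 0.0   # orthogonal
```
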

### Templates and Fragments

Prompt template system with variable substitution, attachment handling, and fragment management for reusable prompt components.

```python { .api }
class Template:
    def evaluate(self, input: str, params=None) -> Tuple[Optional[str], Optional[str]]
    def vars(self) -> set
class Fragment(str):
    source: str
    def id(self) -> str
```

[Templates](./templates.md)

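
Template evaluation substitutes `$variable` placeholders, with `$input` naming the prompt input. A minimal sketch using Python's `string.Template`, which follows the same `$name` convention; this is not `llm.Template` itself (which also handles system prompts and attachments):

```python
from string import Template

# Simplified $variable substitution (illustrative, not llm.Template)
def evaluate(template_text, input_text, params=None):
    mapping = {"input": input_text, **(params or {})}
    return Template(template_text).substitute(mapping)

prompt = evaluate(
    "Summarize the following in $tone tone: $input",
    "LLMs are neural networks trained on text.",
    params={"tone": "neutral"},
)
assert prompt == "Summarize the following in neutral tone: LLMs are neural networks trained on text."
```
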

### Configuration and Key Management

User configuration directory management, API key storage and retrieval, model aliases, and default settings with environment variable support.

```python { .api }
def user_dir() -> pathlib.Path
def get_key(input=None, alias=None, env=None) -> Optional[str]
def set_alias(alias: str, model_id: str)
def get_default_model() -> str
def set_default_model(model: str)
```

[Configuration](./configuration.md)

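
`get_key()` resolves a key from an explicit value, the stored keys file, or an environment variable. A toy illustration of that fallback order, not the package's actual logic or file format:

```python
import os

def resolve_key(explicit=None, stored=None, env_var=None):
    """Return the first available key: explicit > stored > environment."""
    if explicit:
        return explicit
    if stored:
        return stored
    if env_var:
        return os.environ.get(env_var)
    return None

os.environ["DEMO_API_KEY"] = "sk-from-env"
assert resolve_key(explicit="sk-direct") == "sk-direct"
assert resolve_key(stored="sk-stored", env_var="DEMO_API_KEY") == "sk-stored"
assert resolve_key(env_var="DEMO_API_KEY") == "sk-from-env"
```
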

### Plugin System

Extensible plugin architecture with hook specifications for registering models, tools, templates, and commands. Enables third-party extensions and custom integrations.

```python { .api }
def get_plugins(all=False) -> List[dict]
def get_tools() -> Dict[str, Union[Tool, Type[Toolbox]]]
@hookimpl
def register_models(register): ...
```

[Plugins](./plugins.md)

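
Conceptually, each plugin exposes hook implementations that the host calls at startup, handing each one a `register` callback. A pluggy-free sketch of the `register_models` flow with hypothetical names:

```python
# Pluggy-free sketch of the register_models hook flow (hypothetical names;
# the real package dispatches these hooks through pluggy)
class DemoModel:
    model_id = "demo-1"

def my_plugin_register_models(register):
    # A plugin's hook implementation receives the register callback
    register(DemoModel())

def load_plugins(hook_impls):
    models = {}

    def register(model, aliases=None):
        models[model.model_id] = model
        for alias in aliases or ():
            models[alias] = model

    for impl in hook_impls:
        impl(register)
    return models

registry = load_plugins([my_plugin_register_models])
assert "demo-1" in registry
```
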

## Error Handling

The package provides comprehensive error handling through specialized exception classes:

```python { .api }
class ModelError(Exception): ...
class NeedsKeyException(ModelError): ...
class UnknownModelError(KeyError): ...
class CancelToolCall(Exception): ...
```

Common error scenarios include missing API keys, unknown model names, tool execution cancellation, and network connectivity issues.

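
Because `NeedsKeyException` subclasses `ModelError`, catching the base class handles both. A self-contained demonstration using locally defined stand-ins with the same names:

```python
# Local stand-ins mirroring the package's exception hierarchy
class ModelError(Exception):
    pass

class NeedsKeyException(ModelError):
    pass

def call_model(key=None):
    if key is None:
        raise NeedsKeyException("No API key found for this model")
    return "ok"

try:
    call_model()
except ModelError as ex:  # catches NeedsKeyException too
    handled = str(ex)

assert handled == "No API key found for this model"
```
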

## CLI Interface

The package includes a full-featured command-line interface accessible via the `llm` command:

```bash
# Basic usage
llm "What is the capital of France?"

# With a specific model
llm -m claude-3-sonnet "Explain quantum computing"

# Interactive conversation
llm chat

# Manage models and keys
llm models list
llm keys set openai

# Embedding operations: store into a collection, then query it
llm embed documents doc1 -c "Paris is the capital of France"
llm similar documents -c "French capital"
```

The CLI provides access to all library functionality including model management, conversation handling, tool execution, embedding operations, and plugin management.


## Constants and Utilities

```python { .api }
DEFAULT_MODEL = "gpt-4o-mini"

def encode(values: List[float]) -> bytes
def decode(binary: bytes) -> List[float]
def schema_dsl(schema_dsl: str) -> Dict[str, Any]
```

The package includes utility functions for vector encoding/decoding, schema generation from DSL strings, and various text processing operations.
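
The encode/decode pair stores embedding vectors as compact binary blobs, typically by packing them as little-endian 32-bit floats. A sketch of that scheme using the stdlib `struct` module (the package's exact format should be treated as an implementation detail):

```python
import struct

def encode(values):
    """Pack a list of floats into little-endian 32-bit binary (4 bytes each)."""
    return struct.pack("<" + "f" * len(values), *values)

def decode(binary):
    """Unpack little-endian 32-bit binary back into a list of floats."""
    return list(struct.unpack("<" + "f" * (len(binary) // 4), binary))

# These values are exactly representable as float32, so the roundtrip is exact
assert decode(encode([1.0, 0.5, -2.0])) == [1.0, 0.5, -2.0]
```
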