# Configuration

User configuration directory management, API key storage and retrieval, model aliases, and default settings with environment variable support. This module provides comprehensive configuration management for the LLM package.

## Capabilities

### User Directory Management

The package maintains a user-specific configuration directory for storing settings, keys, and data.

```python { .api }
def user_dir() -> pathlib.Path:
    """
    Get the user configuration directory path.

    Returns:
        Path to the user configuration directory. The directory is
        created if it doesn't exist.

    Notes:
        - Uses the LLM_USER_PATH environment variable if set
        - Otherwise uses a platform-specific app directory
        - The directory is created automatically if it doesn't exist
    """
```
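
The lookup described above can be sketched as follows. This is a minimal illustration, not the package's actual implementation: `user_dir_sketch` and the fallback directory location are assumptions.

```python
import os
import pathlib


def user_dir_sketch() -> pathlib.Path:
    """Resolve the configuration directory as described in the notes above."""
    env_path = os.environ.get("LLM_USER_PATH")
    if env_path:
        # LLM_USER_PATH takes precedence when set
        path = pathlib.Path(env_path)
    else:
        # Fall back to a platform-specific app directory; the exact
        # location here is a placeholder for illustration
        path = pathlib.Path.home() / ".config" / "llm"
    # Create the directory automatically if it doesn't exist
    path.mkdir(parents=True, exist_ok=True)
    return path
```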

### API Key Management

A comprehensive system for storing, retrieving, and managing API keys from multiple sources.

```python { .api }
def get_key(
    input: Optional[str] = None,
    *,
    alias: Optional[str] = None,
    env: Optional[str] = None,
) -> Optional[str]:
    """
    Retrieve an API key from various sources with a fallback hierarchy.

    Args:
        input: Direct key value or alias name to look up
        alias: Key alias to look up in stored keys
        env: Environment variable name to check

    Returns:
        API key string, or None if not found

    Notes:
        Key resolution order:
        1. If input matches a stored alias, return that key
        2. If input is provided and not an alias, return input directly
        3. If alias matches a stored key, return that key
        4. If the env variable is set, return its value
        5. Return None if no key is found
    """

def load_keys() -> dict:
    """
    Load stored API keys from the configuration file.

    Returns:
        Dictionary of alias -> key mappings from keys.json
    """
```
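
The five-step resolution order can be sketched as a pure function. `resolve_key` and its `stored` parameter (standing in for the contents of keys.json) are hypothetical names used only for illustration:

```python
import os
from typing import Optional


def resolve_key(
    stored: dict,
    input: Optional[str] = None,
    *,
    alias: Optional[str] = None,
    env: Optional[str] = None,
) -> Optional[str]:
    """Apply the documented fallback hierarchy to a dict of stored keys."""
    # 1. If input matches a stored alias, return that key
    if input is not None and input in stored:
        return stored[input]
    # 2. If input is provided and not an alias, return it directly
    if input is not None:
        return input
    # 3. If alias matches a stored key, return that key
    if alias is not None and alias in stored:
        return stored[alias]
    # 4. If the environment variable is set, return its value
    if env is not None and os.environ.get(env):
        return os.environ[env]
    # 5. No key found
    return None
```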

### Model Aliases

A system for creating and managing model name aliases for convenience and consistency.

```python { .api }
def set_alias(alias: str, model_id_or_alias: str):
    """
    Set an alias to point to a specific model.

    Args:
        alias: Alias name to create
        model_id_or_alias: Model ID or existing alias to point to

    Notes:
        - Resolves model_id_or_alias to the actual model ID if possible
        - Falls back to the exact string if the model is not found
        - Saves the alias to aliases.json in the user directory
    """

def remove_alias(alias: str):
    """
    Remove an existing alias.

    Args:
        alias: Alias name to remove

    Raises:
        KeyError: If the alias doesn't exist or aliases.json is invalid
    """
```
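
A minimal sketch of how aliases might be persisted to aliases.json. The `_sketch` names and the read-merge-write behavior are illustrative assumptions (the real `set_alias` also resolves the target to a model ID when possible):

```python
import json
import pathlib


def set_alias_sketch(aliases_path: pathlib.Path, alias: str, model_id: str) -> None:
    """Merge one alias -> model ID mapping into the aliases file."""
    current = json.loads(aliases_path.read_text()) if aliases_path.exists() else {}
    current[alias] = model_id
    aliases_path.write_text(json.dumps(current, indent=4))


def remove_alias_sketch(aliases_path: pathlib.Path, alias: str) -> None:
    """Remove an alias, raising KeyError if it is not present."""
    current = json.loads(aliases_path.read_text()) if aliases_path.exists() else {}
    del current[alias]  # KeyError if missing, mirroring remove_alias()
    aliases_path.write_text(json.dumps(current, indent=4))
```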

### Default Model Configuration

Functions to manage default model settings for various operations.

```python { .api }
def get_default_model() -> str:
    """
    Get the current default model name.

    Returns:
        Default model name, or the DEFAULT_MODEL constant if not configured
    """

def set_default_model(model: Optional[str]):
    """
    Set the default model.

    Args:
        model: Model name to set as default, or None to clear

    Notes:
        - Setting to None removes the default_model.txt file
        - The DEFAULT_MODEL constant is used when no default is set
    """

def get_default_embedding_model() -> Optional[str]:
    """
    Get the current default embedding model name.

    Returns:
        Default embedding model name, or None if not configured
    """

def set_default_embedding_model(model: Optional[str]):
    """
    Set the default embedding model.

    Args:
        model: Embedding model name to set as default, or None to clear
    """
```
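
The file-backed default can be sketched as below. The `_sketch` names and the placeholder DEFAULT_MODEL value are assumptions for illustration, not the package's actual constant:

```python
import pathlib
from typing import Optional

DEFAULT_MODEL = "gpt-3.5-turbo"  # placeholder; the real constant lives in the package


def get_default_model_sketch(path: pathlib.Path) -> str:
    """Read default_model.txt, falling back to the DEFAULT_MODEL constant."""
    if path.exists():
        return path.read_text().strip()
    return DEFAULT_MODEL


def set_default_model_sketch(path: pathlib.Path, model: Optional[str]) -> None:
    """Write the default, or remove the file when model is None."""
    if model is None:
        path.unlink(missing_ok=True)  # clearing removes default_model.txt
    else:
        path.write_text(model)
```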

## Configuration Files

The user directory contains several configuration files:

### keys.json

Stores API keys mapped to aliases:

```json
{
    "openai": "sk-...",
    "anthropic": "sk-ant-...",
    "my_key": "custom-key-value"
}
```

### aliases.json

Maps model aliases to actual model IDs:

```json
{
    "fast": "gpt-3.5-turbo",
    "smart": "gpt-4",
    "claude": "claude-3-sonnet-20240229"
}
```

### default_model.txt

Contains the name of the default model:

```
gpt-4
```

### default_embedding_model.txt

Contains the name of the default embedding model:

```
text-embedding-ada-002
```

## Usage Examples

### Basic Directory Operations

```python
import os

import llm

# Get the user configuration directory (created automatically if missing)
config_dir = llm.user_dir()
print(f"Configuration directory: {config_dir}")
print(f"Directory exists: {config_dir.exists()}")

# Check which configuration files exist
config_files = os.listdir(config_dir)
print(f"Configuration files: {config_files}")
```

### API Key Management

```python
import llm

# Load all stored keys
keys = llm.load_keys()
print(f"Stored key aliases: {list(keys.keys())}")

# Get a key by alias (from keys.json)
openai_key = llm.get_key(alias="openai")
if openai_key:
    print("OpenAI key found")
else:
    print("OpenAI key not configured")

# Get a key from an environment variable
env_key = llm.get_key(env="OPENAI_API_KEY")
if env_key:
    print("Key found in environment")

# Resolution hierarchy: input (alias or literal) -> alias -> environment variable
api_key = llm.get_key(
    input="my_openai_key",  # Tried as a stored alias first, otherwise used as the key itself
    alias="openai",         # Consulted when input is None
    env="OPENAI_API_KEY",   # Final fallback when no stored key matches
)

# Using with a model that needs a key
if api_key:
    model = llm.get_model("gpt-4")
    # The key is used automatically by KeyModel implementations
```

### Model Aliases

```python
import llm

# Set up convenient aliases
llm.set_alias("fast", "gpt-3.5-turbo")
llm.set_alias("smart", "gpt-4")
llm.set_alias("claude", "claude-3-sonnet-20240229")

# Use aliases instead of full model names
fast_model = llm.get_model("fast")
smart_model = llm.get_model("smart")

print(f"Fast model: {fast_model.model_id}")
print(f"Smart model: {smart_model.model_id}")

# Point an alias at another alias (set_alias resolves it to the model ID)
llm.set_alias("default", "smart")
default_model = llm.get_model("default")
print(f"Default model: {default_model.model_id}")

# Remove an alias when it is no longer needed
llm.remove_alias("default")
```

### Default Model Configuration

```python
import llm

# Check the current default
current_default = llm.get_default_model()
print(f"Current default model: {current_default}")

# Set a new default
llm.set_default_model("gpt-4")
print(f"New default: {llm.get_default_model()}")

# Use the default model (no name specified)
model = llm.get_model()  # Uses the default automatically
print(f"Got model: {model.model_id}")

# Clear the default (reverts to the DEFAULT_MODEL constant)
llm.set_default_model(None)
print(f"After clearing: {llm.get_default_model()}")

# Embedding model defaults work similarly
llm.set_default_embedding_model("text-embedding-ada-002")
embedding_model = llm.get_embedding_model()  # Uses the default if available
```

### Environment Variable Integration

```python
import os
import tempfile

import llm

# Point LLM_USER_PATH at a custom (writable) location for testing;
# user_dir() will create the directory, so it must be somewhere writable
os.environ["LLM_USER_PATH"] = os.path.join(tempfile.gettempdir(), "llm-config")
os.environ["OPENAI_API_KEY"] = "sk-test-key-from-env"

# user_dir() now resolves to the custom directory
custom_dir = llm.user_dir()
print(f"Custom config directory: {custom_dir}")

# Key from the environment
env_key = llm.get_key(env="OPENAI_API_KEY")
if env_key:
    print(f"Key from environment: {env_key[:10]}...")

# Hierarchical key resolution: try the alias first, fall back to the environment
key = llm.get_key(alias="openai", env="OPENAI_API_KEY")
```

### Configuration File Management

```python
import json

import llm

# Manually inspect configuration files
config_dir = llm.user_dir()

# Check aliases
aliases_file = config_dir / "aliases.json"
if aliases_file.exists():
    with open(aliases_file) as f:
        aliases = json.load(f)
    print(f"Current aliases: {aliases}")

# Check keys (be careful with security)
keys_file = config_dir / "keys.json"
if keys_file.exists():
    keys = llm.load_keys()
    # Don't print the actual keys!
    print(f"Configured key aliases: {list(keys.keys())}")

# Check the default model
default_file = config_dir / "default_model.txt"
if default_file.exists():
    with open(default_file) as f:
        default_model = f.read().strip()
    print(f"Default model from file: {default_model}")
```

### Advanced Configuration Patterns

```python
import os

import llm

# Set up development vs production configurations
def setup_dev_config():
    """Configure for a development environment."""
    llm.set_alias("dev_chat", "gpt-3.5-turbo")
    llm.set_alias("dev_smart", "gpt-4")
    llm.set_default_model("dev_chat")
    llm.set_default_embedding_model("text-embedding-ada-002")

def setup_prod_config():
    """Configure for a production environment."""
    llm.set_alias("prod_chat", "gpt-4")
    llm.set_alias("prod_embeddings", "text-embedding-3-large")
    llm.set_default_model("prod_chat")
    llm.set_default_embedding_model("prod_embeddings")

# Environment-based configuration
if os.getenv("ENV") == "production":
    setup_prod_config()
else:
    setup_dev_config()

# Use the environment-appropriate defaults
model = llm.get_model()  # Gets the configured default
embedding_model = llm.get_embedding_model()  # Gets the configured default
```

### Configuration Validation

```python
import llm

def validate_configuration():
    """Validate the current configuration setup."""
    issues = []

    # Check that the user directory is accessible
    try:
        config_dir = llm.user_dir()
        if not config_dir.exists():
            issues.append("Configuration directory not found")
    except Exception as e:
        issues.append(f"Cannot access configuration directory: {e}")

    # Check for required key aliases
    required_keys = ["openai", "anthropic"]
    keys = llm.load_keys()
    for required_key in required_keys:
        if required_key not in keys:
            issues.append(f"Missing required key alias: {required_key}")

    # Check that the default model is valid
    try:
        default_model = llm.get_model()
        print(f"Default model OK: {default_model.model_id}")
    except Exception as e:
        issues.append(f"Default model invalid: {e}")

    # Report issues
    if issues:
        print("Configuration issues found:")
        for issue in issues:
            print(f"- {issue}")
    else:
        print("Configuration validation passed")

validate_configuration()
```

### Migration and Backup

```python
import json
import shutil
from datetime import datetime

import llm

def backup_configuration():
    """Create a backup copy of the configuration directory."""
    config_dir = llm.user_dir()
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_dir = config_dir.parent / f"llm_config_backup_{timestamp}"

    shutil.copytree(config_dir, backup_dir)
    print(f"Configuration backed up to: {backup_dir}")
    return backup_dir

def export_config():
    """Export the configuration to a portable format (keys excluded)."""
    config_dir = llm.user_dir()

    export_data = {
        "aliases": {},
        "default_model": None,
        "default_embedding_model": None,
        "export_timestamp": datetime.now().isoformat(),
    }

    # Export aliases (safe to share)
    aliases_file = config_dir / "aliases.json"
    if aliases_file.exists():
        with open(aliases_file) as f:
            export_data["aliases"] = json.load(f)

    # Export defaults
    export_data["default_model"] = llm.get_default_model()
    export_data["default_embedding_model"] = llm.get_default_embedding_model()

    # Note: keys are deliberately not exported, for security
    print("Configuration exported (keys excluded for security)")
    return export_data

# Create a backup before major changes
backup_path = backup_configuration()

# Export a shareable configuration
config_export = export_config()
print(json.dumps(config_export, indent=2))
```

This configuration system provides secure, flexible management of API keys, model preferences, and user settings, supporting both programmatic and file-based configuration approaches.