# Advanced Components

Lower-level components for building custom optimization workflows. These components provide fine-grained control over the optimization process and can be combined to create specialized optimization strategies.

## Capabilities

### Surrogate Models

Base interface and concrete implementations for learning from trial history to predict configuration performance.

```python { .api }
class AbstractModel:
    def __init__(
        self,
        configspace: ConfigurationSpace,
        instance_features: dict[str, list[int | float]] | None = None,
        pca_components: int | None = 7,
        seed: int = 0
    ):
        """
        Base surrogate model interface.

        Parameters:
        - configspace: Configuration space definition
        - instance_features: Features for problem instances
        - pca_components: PCA dimensionality reduction components
        - seed: Random seed for reproducibility
        """

    def train(self, X: np.ndarray, Y: np.ndarray) -> AbstractModel:
        """
        Train model on historical data.

        Parameters:
        - X: Input configurations as feature matrix
        - Y: Target values (costs/objectives)

        Returns:
        - Self for method chaining
        """

    def predict(
        self,
        X: np.ndarray,
        covariance_type: str | None = "diagonal"
    ) -> tuple[np.ndarray, np.ndarray | None]:
        """
        Make predictions for new configurations.

        Parameters:
        - X: Configurations to predict
        - covariance_type: Type of uncertainty estimation

        Returns:
        - (mean_predictions, uncertainty_estimates)
        """

    def predict_marginalized(
        self,
        X: np.ndarray
    ) -> tuple[np.ndarray, np.ndarray | None]:
        """
        Predict averaging over problem instances.

        Parameters:
        - X: Configurations to predict

        Returns:
        - (mean_predictions, uncertainty_estimates)
        """

    @property
    def meta(self) -> dict[str, Any]:
        """Model metadata and hyperparameters."""
```

Concrete model implementations:

```python { .api }
class RandomModel(AbstractModel):
    """Random baseline model that returns random predictions."""

class GaussianProcess(AbstractModel):
    """Gaussian process model for continuous optimization."""

class MCMCGaussianProcess(AbstractModel):
    """MCMC-based Gaussian process with integrated uncertainty."""

class RandomForest(AbstractModel):
    """Random forest model for mixed variable spaces."""

class MultiObjectiveModel(AbstractModel):
    """Wrapper for handling multiple objectives."""
```
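
The train/predict contract above can be illustrated with a deliberately naive surrogate — a standalone sketch, not part of SMAC — that memorizes training points and predicts the mean and variance of the k nearest neighbors:

```python
import math

class NearestNeighborSurrogate:
    """Toy surrogate following the train/predict shape of AbstractModel.

    Illustrative only -- not a SMAC class. Predicts the mean of the k
    nearest training points and uses their variance as the uncertainty.
    """

    def __init__(self, k: int = 3):
        self.k = k
        self._X: list[list[float]] = []
        self._y: list[float] = []

    def train(self, X, Y):
        self._X, self._y = [list(x) for x in X], list(Y)
        return self  # chainable, like AbstractModel.train

    def predict(self, X):
        means, variances = [], []
        for x in X:
            # indices of the k nearest stored points
            nearest = sorted(
                range(len(self._X)),
                key=lambda i: math.dist(x, self._X[i]),
            )[: self.k]
            ys = [self._y[i] for i in nearest]
            mu = sum(ys) / len(ys)
            var = sum((y - mu) ** 2 for y in ys) / len(ys)
            means.append(mu)
            variances.append(var)
        return means, variances

model = NearestNeighborSurrogate(k=2)
model.train([[0.0], [1.0], [2.0]], [0.0, 1.0, 4.0])
mu, var = model.predict([[0.9]])  # -> ([0.5], [0.25])
```

Real surrogates differ only in how they fit and quantify uncertainty; the interface stays the same.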

### Acquisition Functions

Functions that determine which configuration to evaluate next based on surrogate model predictions.

```python { .api }
class AbstractAcquisitionFunction:
    def __call__(self, configurations: np.ndarray) -> np.ndarray:
        """
        Evaluate acquisition function for configurations.

        Parameters:
        - configurations: Configurations to evaluate

        Returns:
        - Acquisition values (higher = more promising)
        """

    def update(self, model: AbstractModel, **kwargs) -> None:
        """
        Update acquisition function after model training.

        Parameters:
        - model: Updated surrogate model
        - **kwargs: Additional update parameters
        """

    @property
    def name(self) -> str:
        """Function name."""

    @property
    def model(self) -> AbstractModel | None:
        """Associated surrogate model."""

    @property
    def meta(self) -> dict[str, Any]:
        """Function metadata."""
```

Concrete acquisition functions:

```python { .api }
class EI(AbstractAcquisitionFunction):
    """Expected Improvement acquisition function."""

class EIPS(AbstractAcquisitionFunction):
    """Expected Improvement Per Second (for runtime optimization)."""

class PI(AbstractAcquisitionFunction):
    """Probability of Improvement acquisition function."""

class LCB(AbstractAcquisitionFunction):
    """Lower Confidence Bound acquisition function."""

class TS(AbstractAcquisitionFunction):
    """Thompson Sampling acquisition function."""

class PriorAcquisitionFunction(AbstractAcquisitionFunction):
    """Acquisition function weighted by user-specified priors over promising regions."""

class IntegratedAcquisitionFunction(AbstractAcquisitionFunction):
    """MCMC-integrated acquisition function."""
```
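
To make concrete how an acquisition function turns model predictions into a score, here is the standard closed-form Expected Improvement for minimization, EI = (f_best − μ − ξ)Φ(z) + σφ(z) with z = (f_best − μ − ξ)/σ — a standalone sketch, not SMAC's vectorized implementation:

```python
import math

def expected_improvement(mu: float, sigma: float, f_best: float, xi: float = 0.0) -> float:
    """Closed-form EI for minimization: higher = more promising."""
    if sigma <= 0.0:
        return max(f_best - mu - xi, 0.0)
    z = (f_best - mu - xi) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (f_best - mu - xi) * cdf + sigma * pdf

# A point predicted well below the incumbent scores higher than one above it,
# but the one above still gets a small positive score from its uncertainty.
better = expected_improvement(mu=0.2, sigma=0.1, f_best=0.5)
worse = expected_improvement(mu=0.8, sigma=0.1, f_best=0.5)
```

The `xi` parameter (as in `EI(xi=0.01)` later in this page) shifts the improvement threshold to trade exploration against exploitation.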

### Acquisition Maximizers

Optimization strategies for finding configurations that maximize acquisition functions.

```python { .api }
class AbstractAcquisitionMaximizer:
    def maximize(
        self,
        acquisition_function: AbstractAcquisitionFunction,
        history: RunHistory,
        num_points: int
    ) -> list[Configuration]:
        """
        Find configurations that maximize acquisition function.

        Parameters:
        - acquisition_function: Function to maximize
        - history: Historical trial data
        - num_points: Number of configurations to return

        Returns:
        - List of promising configurations
        """

Concrete maximizers:

```python { .api }
class RandomSearch(AbstractAcquisitionMaximizer):
    """Random sampling acquisition maximizer."""

class LocalSearch(AbstractAcquisitionMaximizer):
    """Local optimization acquisition maximizer."""

class DifferentialEvolution(AbstractAcquisitionMaximizer):
    """Evolutionary algorithm acquisition maximizer."""

class LocalAndSortedRandomSearch(AbstractAcquisitionMaximizer):
    """Hybrid random/local search acquisition maximizer."""
```
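
The simplest strategy, random search, just scores uniformly drawn candidates and keeps the best ones. A minimal sketch (the function names and the 1-D unit-interval search space are illustrative, not SMAC's API):

```python
import random

def random_search_maximize(acquisition, sample, num_candidates: int, num_points: int):
    """Draw candidates, score them, return the top `num_points` by acquisition value."""
    candidates = [sample() for _ in range(num_candidates)]
    candidates.sort(key=acquisition, reverse=True)  # higher acquisition = more promising
    return candidates[:num_points]

rng = random.Random(0)
# Toy acquisition function: prefers points close to 0.7.
best = random_search_maximize(
    acquisition=lambda x: -abs(x - 0.7),
    sample=lambda: rng.random(),
    num_candidates=1000,
    num_points=5,
)
```

`LocalAndSortedRandomSearch` refines this idea by running local search from the best-scored random candidates.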

### Initial Designs

Sampling strategies for initial exploration before surrogate model training.

```python { .api }
class AbstractInitialDesign:
    def select_initial_configurations(
        self,
        num_configs: int,
        additional_configs: list[Configuration] | None = None
    ) -> list[Configuration]:
        """
        Generate initial configurations for exploration.

        Parameters:
        - num_configs: Number of configurations to generate
        - additional_configs: User-provided configurations to include

        Returns:
        - List of initial configurations
        """

Concrete initial designs:

```python { .api }
class RandomInitialDesign(AbstractInitialDesign):
    """Random sampling initial design."""

class LatinHypercubeInitialDesign(AbstractInitialDesign):
    """Latin hypercube sampling for space-filling design."""

class SobolInitialDesign(AbstractInitialDesign):
    """Sobol sequence sampling for low-discrepancy coverage."""

class FactorialInitialDesign(AbstractInitialDesign):
    """Factorial design for systematic parameter exploration."""

class DefaultInitialDesign(AbstractInitialDesign):
    """Default configuration only (no random exploration)."""
```
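
To show what a space-filling design buys over plain random sampling, here is a minimal Latin hypercube sampler for the unit hypercube (an illustrative sketch, not SMAC's implementation): each dimension is split into `n` equal strata, and every stratum receives exactly one point.

```python
import random

def latin_hypercube(n: int, dims: int, rng: random.Random) -> list[list[float]]:
    """n points in [0, 1)^dims with one point per 1/n stratum in each dimension."""
    columns = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)  # random pairing of strata across dimensions
        columns.append([(s + rng.random()) / n for s in strata])
    return [list(point) for point in zip(*columns)]

points = latin_hypercube(n=10, dims=2, rng=random.Random(42))
```

Unlike uniform random sampling, this guarantees that no region of any single dimension is left unexplored, which matters when the initial budget is small.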

### Intensifiers

Strategies for comparing configurations and managing multi-fidelity budget allocation.

```python { .api }
class AbstractIntensifier:
    def get_next_trial(self, incumbent: Configuration | None = None) -> TrialInfo:
        """
        Get next trial to execute.

        Parameters:
        - incumbent: Current best configuration

        Returns:
        - Trial information for next evaluation
        """

    def update_incumbents(self, trials: list[tuple[TrialInfo, TrialValue]]) -> None:
        """
        Update incumbent configurations based on new trial results.

        Parameters:
        - trials: List of (trial_info, trial_value) pairs
        """
```

Concrete intensifiers:

```python { .api }
class Intensifier(AbstractIntensifier):
    """Basic racing intensifier for configuration comparison."""

class SuccessiveHalving(AbstractIntensifier):
    """Multi-fidelity successive halving intensifier."""

class Hyperband(AbstractIntensifier):
    """Multi-fidelity Hyperband intensifier with multiple brackets."""
```
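
The budget schedule behind successive halving can be sketched in a few lines (illustrative, assuming the common formulation with reduction factor `eta`): start many configurations on a small budget, then at each rung keep the top 1/eta and multiply the survivors' budget by eta.

```python
def successive_halving_schedule(n_configs: int, min_budget: float, max_budget: float, eta: int = 2):
    """Return (num_surviving_configs, budget_per_config) for each rung."""
    rungs = []
    n, budget = n_configs, min_budget
    while budget <= max_budget and n >= 1:
        rungs.append((n, budget))
        n = n // eta           # keep the best 1/eta configurations
        budget = budget * eta  # give survivors eta times the budget
    return rungs

schedule = successive_halving_schedule(n_configs=8, min_budget=1.0, max_budget=8.0, eta=2)
# schedule == [(8, 1.0), (4, 2.0), (2, 4.0), (1, 8.0)]
```

Hyperband runs several such brackets with different trade-offs between `n_configs` and `min_budget`, hedging against a bad choice of starting budget.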

### Multi-Objective Algorithms

Strategies for handling multiple optimization objectives through scalarization.

```python { .api }
class AbstractMultiObjectiveAlgorithm:
    def scalarize(self, objectives: list[float]) -> float:
        """
        Convert multiple objectives to single scalar value.

        Parameters:
        - objectives: List of objective values

        Returns:
        - Scalarized value for comparison
        """
```

Concrete multi-objective algorithms:

```python { .api }
class MeanAggregationStrategy(AbstractMultiObjectiveAlgorithm):
    """Weighted sum scalarization strategy."""

class ParEGO(AbstractMultiObjectiveAlgorithm):
    """ParEGO scalarization with random weight vectors."""
```
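
ParEGO's scalarization, for example, combines objectives with the augmented Tchebycheff function under a randomly drawn weight vector. A standalone sketch of that formula (the `rho` default and weight handling here are illustrative, not SMAC's exact code):

```python
def augmented_tchebycheff(objectives: list[float], weights: list[float], rho: float = 0.05) -> float:
    """ParEGO-style scalarization for minimization: the max term pushes toward
    the Pareto front, the small weighted-sum term breaks ties."""
    weighted = [w * f for w, f in zip(weights, objectives)]
    return max(weighted) + rho * sum(weighted)

# With equal weights, a solution dominated in both objectives scalarizes worse (higher).
a = augmented_tchebycheff([0.2, 0.4], weights=[0.5, 0.5])
b = augmented_tchebycheff([0.6, 0.8], weights=[0.5, 0.5])
```

Redrawing the weight vector each iteration lets a single-objective optimizer trace out different regions of the Pareto front over time.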

### Random Designs

Strategies for introducing randomness during optimization to maintain exploration.

```python { .api }
class AbstractRandomDesign:
    def should_use_random_design(self, iteration: int) -> bool:
        """
        Determine whether to use random sampling instead of model-based selection.

        Parameters:
        - iteration: Current optimization iteration

        Returns:
        - True if random design should be used
        """
```

Concrete random designs:

```python { .api }
class ProbabilityRandomDesign(AbstractRandomDesign):
    """Fixed probability random sampling."""

class ModulusRandomDesign(AbstractRandomDesign):
    """Modulus-based random sampling schedule."""

class CosineAnnealingRandomDesign(AbstractRandomDesign):
    """Cosine annealing schedule for random sampling."""
```
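
A modulus schedule is the easiest to sketch: every `modulus`-th iteration uses a random configuration instead of the model's suggestion. This toy class (illustrative, not SMAC's implementation) mirrors the `should_use_random_design` interface above:

```python
class ModulusSchedule:
    """Return True every `modulus`-th iteration, forcing a random configuration."""

    def __init__(self, modulus: int = 5):
        self.modulus = modulus

    def should_use_random_design(self, iteration: int) -> bool:
        return iteration % self.modulus == 0

design = ModulusSchedule(modulus=3)
flags = [design.should_use_random_design(i) for i in range(6)]
# flags == [True, False, False, True, False, False]
```

Interleaving random configurations this way keeps the optimizer from over-trusting an immature surrogate model.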

## Custom Optimization Example

```python
from smac import AbstractFacade, Scenario
from smac.model import RandomForest
from smac.acquisition.function import EI
from smac.acquisition.maximizer import LocalAndSortedRandomSearch
from smac.initial_design import LatinHypercubeInitialDesign
from smac.intensifier import Intensifier
from smac.multi_objective import MeanAggregationStrategy

class CustomFacade(AbstractFacade):
    @staticmethod
    def get_model(scenario, **kwargs):
        return RandomForest(
            configspace=scenario.configspace,
            n_trees=20,    # Custom: more trees
            max_depth=15,  # Custom: shallower trees
            **kwargs
        )

    @staticmethod
    def get_acquisition_function(scenario, **kwargs):
        return EI(xi=0.01)  # Custom: lower exploration parameter

    @staticmethod
    def get_acquisition_maximizer(scenario, **kwargs):
        return LocalAndSortedRandomSearch(
            configspace=scenario.configspace,
            challengers=5000,           # Custom: more challengers
            local_search_iterations=20  # Custom: more local search
        )

    @staticmethod
    def get_initial_design(scenario, **kwargs):
        return LatinHypercubeInitialDesign(
            scenario=scenario,
            n_configs=50,  # Custom: more initial samples
            **kwargs
        )

    @staticmethod
    def get_intensifier(scenario, **kwargs):
        return Intensifier(
            scenario=scenario,
            max_config_calls=5,  # Custom: more evaluations per config
            **kwargs
        )

    @staticmethod
    def get_multi_objective_algorithm(scenario, **kwargs):
        return MeanAggregationStrategy(scenario=scenario)

    # The remaining abstract components (e.g. random design, runhistory
    # encoder) must be implemented analogously.

# Use the custom facade (`config_space` and `objective` are defined elsewhere)
scenario = Scenario(configspace=config_space, n_trials=100)
custom_facade = CustomFacade(scenario, objective)
result = custom_facade.optimize()
```

## Constants

Package-wide constants used throughout SMAC3.

```python { .api }
# Maximum integer value (2^31 - 1)
MAXINT = 2**31 - 1

# Minimum cost for logarithmic scaling
MINIMAL_COST_FOR_LOG = 0.00001

# Maximum cutoff value
MAX_CUTOFF = 65535

# Numerical epsilon for computations
VERY_SMALL_NUMBER = 1e-10

# Default number of trees for random forests
N_TREES = 10
```