# Optimizer Families

Parametrizable optimizer configurations that enable algorithm customization and automated hyperparameter tuning. Optimizer families provide factory patterns for creating specialized optimizer variants with configurable parameters and behavior.

## Capabilities
### Evolution Strategy Families

Configurable Evolution Strategy algorithms with customizable parameters for adaptation, selection, and mutation strategies.

```python { .api }
class ParametrizedOnePlusOne:
    """
    Configurable (1+1) Evolution Strategy.

    Parameters:
    - noise_handling: Noise handling strategy
    - mutation: Mutation type and parameters
    - crossover: Crossover probability
    - use_sphere: Use sphere mutation
    """

class EvolutionStrategy:
    """
    Configurable Evolution Strategy family.

    Parameters:
    - recombination_weights: Recombination weight strategy
    - popsize: Population size
    - offsprings: Number of offspring
    - ranker: Selection ranking method
    """
```
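The loop these families configure is easy to state. As an illustrative pure-Python sketch (not nevergrad's implementation), a (1+1)-ES keeps a single parent, mutates it, and retains the better of the two; the multiplicative `sigma` control below is a crude stand-in for the configurable mutation and noise-handling strategies:

```python
import random

def one_plus_one_es(loss, x0, budget=500, sigma=1.0, seed=0):
    """Toy (1+1)-ES: one parent, one Gaussian-mutated child per iteration,
    elitist replacement, and a multiplicative step-size rule tuned toward
    a roughly one-fifth success rate."""
    rng = random.Random(seed)
    parent = list(x0)
    best = loss(parent)
    for _ in range(budget):
        child = [xi + sigma * rng.gauss(0, 1) for xi in parent]
        value = loss(child)
        if value <= best:      # child wins: keep it and grow the step size
            parent, best = child, value
            sigma *= 1.5
        else:                  # child loses: shrink the step size
            sigma *= 0.9
    return parent, best

sphere = lambda x: sum(xi * xi for xi in x)
x, fx = one_plus_one_es(sphere, [5.0, -3.0])
```

Options such as `mutation="cauchy"` or a `noise_handling` strategy would replace the Gaussian draw and the acceptance test in this loop with heavier-tailed mutations or re-evaluation of noisy candidates.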
### CMA-ES Family

Covariance Matrix Adaptation Evolution Strategy with extensive configuration options for different problem characteristics and computational budgets.

```python { .api }
class ParametrizedCMA:
    """
    Configurable CMA-ES with various parameters.

    Parameters:
    - scale: Coordinate scaling factor
    - elitist: Use elitist strategy
    - diagonal: Use diagonal adaptation
    - fcmaes: Use fast CMA-ES
    - popsize: Population size multiplier
    - active: Use active covariance matrix adaptation
    - random_init: Random initialization strategy
    """
```
### Differential Evolution Family

Configurable Differential Evolution algorithms with various mutation strategies, crossover types, and selection mechanisms.

```python { .api }
class DifferentialEvolution:
    """
    Configurable Differential Evolution family.

    Parameters:
    - initialization: Population initialization method
    - scale: Mutation scale factor (F parameter)
    - crossover: Crossover probability (CR parameter)
    - popsize: Population size
    - strategy: DE mutation strategy
    """
```
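The `scale` (F) and `crossover` (CR) parameters map directly onto the classic DE/rand/1/bin update. A self-contained toy generation (illustrative only, not nevergrad's code):

```python
import random

def de_step(population, fitness, loss, scale=0.8, crossover=0.9, rng=None):
    """One DE/rand/1/bin generation: for each target vector, build a mutant
    from three distinct other members, cross it with the target coordinate
    by coordinate, and keep the trial only if it does not worsen the loss."""
    rng = rng or random.Random(0)
    dim = len(population[0])
    new_pop, new_fit = [], []
    for i, target in enumerate(population):
        a, b, c = rng.sample([p for j, p in enumerate(population) if j != i], 3)
        mutant = [a[d] + scale * (b[d] - c[d]) for d in range(dim)]  # F parameter
        jrand = rng.randrange(dim)  # at least one mutant coordinate survives
        trial = [mutant[d] if (rng.random() < crossover or d == jrand) else target[d]
                 for d in range(dim)]  # CR parameter
        f = loss(trial)
        if f <= fitness[i]:
            new_pop.append(trial); new_fit.append(f)
        else:
            new_pop.append(target); new_fit.append(fitness[i])
    return new_pop, new_fit

rng = random.Random(1)
loss = lambda x: sum(v * v for v in x)
pop = [[rng.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
fit = [loss(p) for p in pop]
for _ in range(40):
    pop, fit = de_step(pop, fit, loss, rng=rng)
```

Other `strategy` values change only how the mutant is assembled (e.g. basing it on the current best rather than a random member).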
### Bayesian Optimization Family

Configurable Bayesian Optimization with customizable acquisition functions, kernel choices, and optimization strategies.

```python { .api }
class ParametrizedBO:
    """
    Configurable Bayesian Optimization.

    Parameters:
    - initialization: Initial design strategy
    - middle_point: Use middle point initialization
    - utility_kind: Acquisition function type
    - utility_kappa: Exploration parameter
    - utility_xi: Exploitation parameter
    - gp_parameters: Gaussian process configuration
    """

class BayesOptim:
    """
    Bayesian optimization configuration framework.

    Parameters:
    - random_state: Random state for reproducibility
    - init_budget: Initial exploration budget
    - middle_point: Middle point initialization
    """
```
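To make `utility_kind="ucb"` and `utility_kappa` concrete, here is a toy Upper Confidence Bound computation; the posterior means and standard deviations are made-up numbers standing in for a fitted Gaussian process:

```python
def ucb(mean, std, kappa=2.576):
    """Upper Confidence Bound acquisition (maximization): favour points with
    a high predicted mean (exploitation) or high predictive uncertainty
    (exploration); kappa sets the trade-off."""
    return mean + kappa * std

# Three candidate points with hypothetical GP posterior (mean, std) estimates.
candidates = {"a": (0.9, 0.05), "b": (0.5, 0.40), "c": (0.2, 0.01)}
scores = {name: ucb(m, s) for name, (m, s) in candidates.items()}

best = max(scores, key=scores.get)  # the uncertain point "b" wins exploration credit
greedy = max(candidates, key=lambda n: ucb(*candidates[n], kappa=0.0))  # kappa=0: pure exploitation
```

With `kappa=2.576` the uncertain candidate outranks the one with the best mean; with `kappa=0` the ranking collapses to the predicted means alone.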
### Meta-Model Families

Configurable surrogate model-based optimization with various model types and learning strategies.

```python { .api }
class ParametrizedMetaModel:
    """
    Configurable metamodel optimization.

    Parameters:
    - model: Surrogate model type ("polynomial", "neural", "svm", "rf")
    - acquisition: Acquisition strategy
    - multivariate_optimizer: Underlying optimizer for metamodel
    """
```
### Sampling and Search Families

Configurable sampling-based search methods with various sequence types and initialization strategies.

```python { .api }
class RandomSearchMaker:
    """
    Configurable random search variants.

    Parameters:
    - sampler: Sampling strategy
    - scrambled: Use scrambled sequences
    - opposition_mode: Opposition-based learning mode
    - cauchy: Use Cauchy distribution
    """

class SamplingSearch:
    """
    Configurable sampling-based search methods.

    Parameters:
    - sampler: Base sampling method
    - scrambled: Scrambling strategy
    - autorescale: Automatic rescaling
    - opposition_mode: Opposition learning
    """
```
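The `opposition_mode` options are based on opposition-based learning: each sample is paired with its mirror image through the centre of the bounding box, so every draw also probes the "opposite" region of the space. A minimal sketch:

```python
import random

def opposite(point, lower, upper):
    """Opposition-based learning: reflect a point through the centre of the
    box [lower, upper], coordinate by coordinate."""
    return [lo + hi - x for x, lo, hi in zip(point, lower, upper)]

rng = random.Random(0)
lower, upper = [0.0, -1.0], [2.0, 1.0]
sample = [rng.uniform(lo, hi) for lo, hi in zip(lower, upper)]
mirror = opposite(sample, lower, upper)  # evaluated alongside the original draw
```

Reflecting twice returns the original point, so the pairing costs only one extra evaluation per sample.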
### Portfolio and Ensemble Families

Multi-algorithm families that combine multiple optimization strategies with configurable selection and execution patterns.

```python { .api }
class ConfPortfolio:
    """
    Configured portfolio optimizer.

    Parameters:
    - optimizers: List of optimizer configurations
    - weights: Selection weights for optimizers
    - resampling: Resampling strategy
    """

class NonObjectOptimizer:
    """
    Wrapper exposing function-based (non-object) optimization methods
    through nevergrad's interface.

    Parameters:
    - method: Underlying optimization method
    - random_state: Random state configuration
    """
```
### Sequential Optimization Families

Chaining frameworks that combine different algorithms sequentially with configurable transition criteria and resource allocation.

```python { .api }
class Chaining:
    """
    Sequential optimizer chaining framework.

    Parameters:
    - optimizers: Sequence of optimizer configurations
    - budgets: Budget allocation for each optimizer
    - restart: Restart strategy between optimizers
    """

class NoisySplit:
    """
    Noisy optimization with splitting strategies.

    Parameters:
    - num_optims: Number of parallel optimizers
    - num_suggestions: Suggestions per optimizer
    - discrete: Handle discrete variables
    """
```
### Particle Swarm Optimization Family

Configurable PSO algorithms with various topologies, parameter adaptation, and acceleration strategies.

```python { .api }
class ConfPSO:
    """
    Configured Particle Swarm Optimization.

    Parameters:
    - popsize: Swarm size
    - omega: Inertia weight
    - phip: Cognitive acceleration coefficient
    - phig: Social acceleration coefficient
    - max_speed: Maximum particle velocity
    """
```
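A toy global-best PSO shows how `omega`, `phip`, `phig`, and `max_speed` enter the velocity update (an illustrative sketch, not nevergrad's implementation):

```python
import random

def pso(loss, lower, upper, popsize=20, omega=0.5, phip=1.5, phig=1.5,
        max_speed=1.0, iters=60, seed=0):
    """Toy global-best PSO: each particle's velocity blends inertia (omega),
    attraction to its own best position (phip), and attraction to the swarm
    best (phig), clipped to max_speed per coordinate."""
    rng = random.Random(seed)
    dim = len(lower)
    pos = [[rng.uniform(lower[d], upper[d]) for d in range(dim)] for _ in range(popsize)]
    vel = [[0.0] * dim for _ in range(popsize)]
    pbest = [p[:] for p in pos]
    pval = [loss(p) for p in pos]
    g = min(range(popsize), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(popsize):
            for d in range(dim):
                v = (omega * vel[i][d]
                     + phip * rng.random() * (pbest[i][d] - pos[i][d])
                     + phig * rng.random() * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-max_speed, min(max_speed, v))  # speed clipping
                pos[i][d] += vel[i][d]
            f = loss(pos[i])
            if f < pval[i]:
                pbest[i], pval[i] = pos[i][:], f
                if f < gval:
                    gbest, gval = pos[i][:], f
    return gbest, gval

best, value = pso(lambda x: sum(v * v for v in x), [-5.0, -5.0], [5.0, 5.0])
```

Raising `omega` favours exploration (particles keep their momentum); raising `phig` pulls the swarm toward its current best more aggressively.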
### Advanced Algorithm Families

Specialized algorithm families for specific optimization scenarios and problem characteristics.

```python { .api }
class ParametrizedTBPSA:
    """
    Configurable TBPSA (Test-Based Population Size Adaptation) algorithm.

    Parameters:
    - naive: Use naive implementation
    - initial_popsize: Initial population size
    - max_offspring: Maximum offspring per generation
    """

class EMNA:
    """
    Estimation of Multivariate Normal Algorithm.

    Parameters:
    - popsize: Population size
    - sample_size: Sample size for distribution estimation
    - naive: Use naive implementation
    """

class ConfSplitOptimizer:
    """
    Configured split optimization strategies.

    Parameters:
    - num_optims: Number of sub-optimizers
    - progressive: Progressive resource allocation
    - non_deterministic_descriptor: Non-deterministic optimization
    """
```
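EMNA's core idea can be sketched without any library: sample a population from a normal distribution, keep the best fraction, and refit the distribution to those elites. The toy below uses an axis-aligned Gaussian for brevity; nevergrad's implementation differs in the details:

```python
import random
import statistics

def emna(loss, dim=2, popsize=50, elite=10, iters=50, seed=0):
    """Toy EMNA: sample from an axis-aligned normal distribution, keep the
    `elite` best points, and refit the mean and per-coordinate standard
    deviation from those elites."""
    rng = random.Random(seed)
    mean, std = [0.0] * dim, [5.0] * dim
    for _ in range(iters):
        pop = [[rng.gauss(mean[d], std[d]) for d in range(dim)] for _ in range(popsize)]
        pop.sort(key=loss)           # best first
        elites = pop[:elite]
        for d in range(dim):
            col = [e[d] for e in elites]
            mean[d] = statistics.fmean(col)
            std[d] = max(statistics.pstdev(col), 1e-9)  # guard against collapse
    return mean

# The fitted mean drifts toward the optimum at (3, 3).
centre = emna(lambda x: sum((v - 3.0) ** 2 for v in x))
```

The `naive` flag in the API above selects between variants of exactly this distribution-refitting step.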
### External Integration Families

Configuration frameworks for integrating external optimization libraries with nevergrad's interface.

```python { .api }
class Pymoo:
    """
    Pymoo integration family.

    Parameters:
    - algorithm: Pymoo algorithm name
    - termination: Termination criteria
    - save_history: Save optimization history
    """
```
## Usage Examples
### Creating Custom CMA-ES Variants

```python
import numpy as np
import nevergrad as ng

# Create a custom CMA-ES configuration
custom_cma = ng.families.ParametrizedCMA(
    scale=1.0,
    elitist=True,
    diagonal=False,
    popsize=lambda dim: 4 + int(3 * np.log(dim))
)

# Use with a parametrization
param = ng.p.Array(shape=(10,))
optimizer = custom_cma(parametrization=param, budget=200)
```
### Configuring Differential Evolution

```python
# Custom DE with specific parameters
custom_de = ng.families.DifferentialEvolution(
    initialization="LHS",  # Latin Hypercube Sampling
    scale=0.8,             # Mutation scale factor (F)
    crossover=0.9,         # Crossover probability (CR)
    popsize=50,
    strategy="DE/rand/1"
)

optimizer = custom_de(parametrization=param, budget=100)
```
### Creating Algorithm Chains

```python
# Sequential optimization: start with random search, then CMA-ES
chain = ng.families.Chaining([
    ng.families.RandomSearchMaker(),
    ng.families.ParametrizedCMA(diagonal=True)
], budgets=[50, 150])  # 50 evaluations for random search, 150 for CMA-ES

optimizer = chain(parametrization=param, budget=200)
```
### Bayesian Optimization Configuration

```python
# Custom Bayesian optimization setup
custom_bo = ng.families.ParametrizedBO(
    initialization="Hammersley",  # Hammersley-sequence initialization
    utility_kind="ucb",           # Upper Confidence Bound acquisition
    utility_kappa=2.576,          # 99% confidence level
    middle_point=True
)

optimizer = custom_bo(parametrization=param, budget=100)
```
### Portfolio with Multiple Algorithms

```python
# Create a portfolio of different optimizers
portfolio = ng.families.ConfPortfolio(
    optimizers=[
        ng.families.ParametrizedCMA(diagonal=True),
        ng.families.DifferentialEvolution(scale=0.5),
        ng.families.ParametrizedBO()
    ],
    weights=[0.4, 0.4, 0.2]  # Allocation weights
)

optimizer = portfolio(parametrization=param, budget=300)
```
### Meta-Model Configuration

```python
# Neural-network metamodel with a CMA-ES backend
metamodel = ng.families.ParametrizedMetaModel(
    model="neural",
    acquisition="improvement",
    multivariate_optimizer=ng.families.ParametrizedCMA()
)

optimizer = metamodel(parametrization=param, budget=150)
```
### Custom Sampling Strategy

```python
# Quasi-random sampling with opposition-based learning
sampling = ng.families.SamplingSearch(
    sampler="Halton",
    scrambled=True,
    opposition_mode="opposite",
    autorescale=True
)

optimizer = sampling(parametrization=param, budget=100)
```