# Optimization Algorithms

A comprehensive collection of 368+ gradient-free optimization algorithms behind a unified interface. Includes Evolution Strategies, Differential Evolution, Particle Swarm Optimization, Bayesian Optimization, meta-model approaches, and scipy-based methods for black-box optimization tasks.
## Capabilities
### Core Optimizer Interface

The fundamental `Optimizer` base class that all optimization algorithms inherit from, providing a consistent interface for the ask-tell optimization pattern.

```python { .api }
class Optimizer:
    """
    Abstract base class for all optimization algorithms.

    Parameters:
    - parametrization: Parameter object defining the search space
    - budget: maximum number of evaluations (int, optional)
    - num_workers: number of parallel workers (int, default=1)
    """

    def __init__(self, parametrization: Parameter, budget: Optional[int] = None, num_workers: int = 1):
        """Initialize the optimizer with a parametrization and budget."""

    def ask(self) -> Parameter:
        """
        Get the next candidate for evaluation.

        Returns:
            Parameter candidate for function evaluation
        """

    def tell(self, candidate: Parameter, loss: float) -> None:
        """
        Report an evaluation result back to the optimizer.

        Args:
            candidate: the parameter that was evaluated
            loss: the function value (to be minimized)
        """

    def provide_recommendation(self) -> Parameter:
        """
        Get the final recommendation after optimization.

        Returns:
            Best parameter found during optimization
        """

    def minimize(self, function: Callable) -> Parameter:
        """
        Run the complete optimization process.

        Args:
            function: function to minimize

        Returns:
            Best parameter found
        """

    def pareto_front(self) -> List[Parameter]:
        """
        Get the Pareto front for multi-objective optimization.

        Returns:
            List of Pareto-optimal parameters
        """
```
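To make the ask-tell contract concrete, here is a minimal, self-contained sketch of the pattern in plain Python (not nevergrad's implementation; `ToyRandomSearch` is a hypothetical name). A uniform random sampler is the simplest optimizer that satisfies the interface: `ask` proposes, `tell` records, and the incumbent best becomes the recommendation.

```python
import random
from typing import Callable, List, Optional, Tuple

class ToyRandomSearch:
    """Minimal ask-tell optimizer: uniform sampling in a box."""

    def __init__(self, dim: int, budget: int, low: float = -5.0, high: float = 5.0):
        self.dim, self.budget = dim, budget
        self.low, self.high = low, high
        self.best: Optional[Tuple[List[float], float]] = None  # (candidate, loss)

    def ask(self) -> List[float]:
        # Propose a new candidate; a real optimizer would adapt its proposals.
        return [random.uniform(self.low, self.high) for _ in range(self.dim)]

    def tell(self, candidate: List[float], loss: float) -> None:
        # Record the evaluation; keep the incumbent best.
        if self.best is None or loss < self.best[1]:
            self.best = (candidate, loss)

    def provide_recommendation(self) -> List[float]:
        assert self.best is not None, "tell() must be called at least once"
        return self.best[0]

    def minimize(self, function: Callable[[List[float]], float]) -> List[float]:
        # minimize() is just the ask-tell loop run to budget exhaustion.
        for _ in range(self.budget):
            x = self.ask()
            self.tell(x, function(x))
        return self.provide_recommendation()
```

The split between `ask`/`tell` and `minimize` is what makes parallel evaluation possible: a caller can `ask` several candidates, evaluate them concurrently, and `tell` the results in any order.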
### Optimizer Registry
Centralized registry for optimizer discovery, providing programmatic access to all available optimization algorithms by name.

```python { .api }
registry: Registry[OptCls]
"""
Registry containing all available optimizer classes.

Usage:
    optimizer_class = ng.optimizers.registry["CMA"]
    optimizer = optimizer_class(parametrization, budget=100)
"""
```
### Evolution Strategy Optimizers
Evolution Strategy algorithms, including CMA-ES variants and the (1+1) Evolution Strategy, for continuous optimization with adaptive step sizes.

```python { .api }
class CMA(Optimizer):
    """Covariance Matrix Adaptation Evolution Strategy."""

class FCMA(Optimizer):
    """Fast CMA-ES variant."""

class ECMA(Optimizer):
    """Elitist CMA-ES variant."""

class DiagonalCMA(Optimizer):
    """Diagonal CMA-ES for high-dimensional problems."""

class OnePlusOne(Optimizer):
    """(1+1) Evolution Strategy."""

class DiscreteOnePlusOne(Optimizer):
    """Discrete (1+1) Evolution Strategy."""

class NoisyOnePlusOne(Optimizer):
    """Noisy (1+1) Evolution Strategy."""

class OptimisticDiscreteOnePlusOne(Optimizer):
    """Optimistic discrete (1+1) variant."""
```
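The "adaptive step size" idea behind these optimizers can be illustrated with a short sketch of a (1+1) Evolution Strategy using Rechenberg's one-fifth success rule (a classic textbook scheme; this is an illustration, not nevergrad's `OnePlusOne` implementation).

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, budget=500, seed=None):
    """Sketch of a (1+1) Evolution Strategy with a 1/5th-success-style rule:
    enlarge the mutation step after a success, shrink it after a failure."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(budget):
        # Gaussian mutation of the single parent.
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy          # success: accept the child...
            sigma *= 1.5           # ...and increase the step size
        else:
            sigma *= 1.5 ** (-0.25)  # failure: decrease the step size
    return x, fx
```

The asymmetric update keeps roughly one success in five, which balances exploration (large steps far from the optimum) against precision (small steps near it).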
### Differential Evolution Optimizers
Differential Evolution algorithms with various strategies and parameter settings for global optimization.

```python { .api }
class DE(Optimizer):
    """Standard Differential Evolution."""

class TwoPointsDE(Optimizer):
    """Two-point Differential Evolution."""

class LhsDE(Optimizer):
    """DE with Latin Hypercube Sampling initialization."""

class QrDE(Optimizer):
    """Quasi-random Differential Evolution."""

class MiniDE(Optimizer):
    """Minimal Differential Evolution."""

class MiniLhsDE(Optimizer):
    """Minimal DE with LHS initialization."""

class MiniQrDE(Optimizer):
    """Minimal quasi-random DE."""
```
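For readers unfamiliar with how DE generates candidates, here is a sketch of one generation of the classic DE/rand/1/bin scheme (illustrative only; nevergrad's variants differ in strategy and parameter settings).

```python
import random

def de_step(population, fitness, f, F=0.8, CR=0.9, rng=random):
    """One generation of DE/rand/1/bin (sketch): for each target vector,
    build a mutant from three distinct others, apply binomial crossover,
    and keep the trial only if it is at least as good."""
    n, dim = len(population), len(population[0])
    new_pop, new_fit = [], []
    for i in range(n):
        # Pick three distinct individuals, none equal to the target.
        a, b, c = rng.sample([j for j in range(n) if j != i], 3)
        mutant = [population[a][d] + F * (population[b][d] - population[c][d])
                  for d in range(dim)]
        jrand = rng.randrange(dim)  # guarantee at least one mutant coordinate
        trial = [mutant[d] if (rng.random() < CR or d == jrand) else population[i][d]
                 for d in range(dim)]
        ft = f(trial)
        if ft <= fitness[i]:        # greedy one-to-one selection
            new_pop.append(trial); new_fit.append(ft)
        else:
            new_pop.append(population[i]); new_fit.append(fitness[i])
    return new_pop, new_fit
```

The difference vector `F * (b - c)` is what gives DE its self-scaling behavior: step sizes shrink automatically as the population contracts around an optimum.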
### Particle Swarm Optimization
Particle Swarm Optimization algorithms with various topologies and parameter configurations for swarm-based optimization.

```python { .api }
class PSO(Optimizer):
    """Particle Swarm Optimization."""

class QOPSO(Optimizer):
    """Quasi-Oppositional PSO."""

class SQOPSO(Optimizer):
    """Simplified Quasi-Oppositional PSO."""

class SPSO(Optimizer):
    """Standard PSO."""

class RealSpacePSO(Optimizer):
    """Real-space PSO variant."""
```
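All of these variants share the same core update rule, which can be sketched for a single particle as follows (a textbook illustration with inertia weight `w` and attraction coefficients `c1`, `c2`; not nevergrad's internals).

```python
import random

def pso_update(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=random):
    """One PSO velocity/position update for a single particle (sketch):
    inertia, plus stochastic attraction toward the particle's personal
    best (pbest) and the swarm's global best (gbest)."""
    new_v = [w * vi
             + c1 * rng.random() * (pb - xi)
             + c2 * rng.random() * (gb - xi)
             for xi, vi, pb, gb in zip(x, v, pbest, gbest)]
    new_x = [xi + vi for xi, vi in zip(x, new_v)]
    return new_x, new_v
```

The topology choices mentioned above change only where `gbest` comes from (the whole swarm vs. a local neighborhood); the per-particle update stays the same.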
### Sampling-Based Optimizers
Pure sampling methods and quasi-random sampling strategies for exploration-based optimization without learning.

```python { .api }
class RandomSearch(Optimizer):
    """Pure random sampling."""

class HaltonSearch(Optimizer):
    """Halton sequence quasi-random sampling."""

class HammersleySearch(Optimizer):
    """Hammersley sequence sampling."""

class LHSSearch(Optimizer):
    """Latin Hypercube Sampling."""

class QORandomSearch(Optimizer):
    """Quasi-oppositional random search."""

class ScrHammersleySearch(Optimizer):
    """Scrambled Hammersley search."""

class OrthogonalSamplingSearch(Optimizer):
    """Orthogonal sampling-based search."""
```
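The low-discrepancy sequences these optimizers sample from are easy to compute; the Halton sequence, for instance, is just the radical inverse of the sample index in a given base, with one coprime base per dimension (a standard construction, sketched here for illustration).

```python
def halton(index, base):
    """The index-th element (1-based) of the Halton sequence in `base`:
    write the index in that base, mirror its digits across the radix point."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def halton_point(index, bases=(2, 3)):
    """One low-discrepancy point in [0, 1)^d: one Halton coordinate
    per (pairwise coprime) base."""
    return tuple(halton(index, b) for b in bases)
```

In base 2 the sequence begins 1/2, 1/4, 3/4, 1/8, ... — it fills the unit interval far more evenly than uniform random draws, which is why these methods make strong budget-limited baselines.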
### Scipy-Based Optimizers
Wrappers for scipy.optimize algorithms, providing access to classical optimization methods through the unified nevergrad interface.

```python { .api }
class BFGS(Optimizer):
    """Broyden-Fletcher-Goldfarb-Shanno quasi-Newton method."""

class LBFGSB(Optimizer):
    """Limited-memory BFGS with bounds."""

class Powell(Optimizer):
    """Powell's conjugate direction method."""

class NelderMead(Optimizer):
    """Nelder-Mead simplex algorithm."""

class COBYLA(Optimizer):
    """Constrained optimization by linear approximation."""

class SLSQP(Optimizer):
    """Sequential Least Squares Programming."""

class TrustRegionDL(Optimizer):
    """Trust region with dogleg method."""
```
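For reference, these correspond to method names in `scipy.optimize.minimize`; calling scipy directly looks like this (assumes scipy is installed; the wrappers add the ask-tell interface and parametrization handling on top).

```python
from scipy.optimize import minimize

def sphere(x):
    # scipy passes a numpy array, so vectorized operations work.
    return float((x ** 2).sum())

# Direct scipy equivalents of two of the wrapped methods.
res_nm = minimize(sphere, x0=[2.0, -1.5, 0.5], method="Nelder-Mead")
res_lb = minimize(sphere, x0=[2.0, -1.5, 0.5], method="L-BFGS-B",
                  bounds=[(-5, 5)] * 3)
```

Unlike the gradient-free methods above, `L-BFGS-B` estimates gradients by finite differences when none are supplied, so it can be much faster on smooth problems but brittle on noisy ones.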
### Meta-Model Optimizers
Surrogate-model-based optimization using polynomial, neural network, SVM, and random forest metamodels for expensive function evaluations.

```python { .api }
class MetaModel(Optimizer):
    """Polynomial metamodel optimization."""

class NeuralMetaModel(Optimizer):
    """Neural network surrogate models."""

class SVMMetaModel(Optimizer):
    """Support Vector Machine metamodels."""

class RFMetaModel(Optimizer):
    """Random Forest metamodels."""

class MetaTuneRecentering(Optimizer):
    """Meta-model with recentering."""

class EvoMixDeterministic(Optimizer):
    """Evolutionary mixture with deterministic metamodel."""
```
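The core surrogate idea can be shown in one dimension: fit a cheap model to the points evaluated so far, then jump to the model's minimizer instead of spending another real evaluation. This sketch fits a quadratic by least squares (Cramer's rule on the normal equations); the function names are illustrative, not nevergrad's API.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a*x^2 + b*x + c through (xs, ys), 1-D sketch."""
    n = len(xs)
    s = [sum(x ** k for x in xs) for k in range(5)]               # s[k] = sum x^k
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[4], s[3], s[2]],
         [s[3], s[2], s[1]],
         [s[2], s[1], n]]
    rhs = [t[2], t[1], t[0]]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(A)
    coeffs = []
    for j in range(3):  # Cramer's rule: replace column j with the RHS.
        m = [row[:] for row in A]
        for i in range(3):
            m[i][j] = rhs[i]
        coeffs.append(det3(m) / d)
    return coeffs  # [a, b, c]

def surrogate_minimizer(xs, ys):
    """Propose the minimizer of the fitted quadratic: x* = -b / (2a)."""
    a, b, _ = fit_quadratic(xs, ys)
    return -b / (2 * a)
```

The real optimizers generalize this in two ways: richer model classes (neural nets, SVMs, forests) and a safeguard that falls back to the underlying evolutionary search when the surrogate's suggestion looks unreliable.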
### Bayesian Optimization
Bayesian optimization algorithms using Gaussian processes for sequential design and acquisition-function optimization.

```python { .api }
class BO(Optimizer):
    """Basic Bayesian Optimization."""

class PCABO(Optimizer):
    """PCA-based Bayesian Optimization."""

class BayesOptimBO(Optimizer):
    """BayesOpt library integration."""

class UltraLowBudgetBO(Optimizer):
    """Bayesian optimization for very small budgets."""

class ParametrizedBO(Optimizer):
    """Configurable Bayesian Optimization."""
```
### Portfolio and Ensemble Optimizers
Multi-algorithm approaches that combine several optimization strategies or run algorithms in parallel for robust optimization.

```python { .api }
class NGOpt(Optimizer):
    """Automatic algorithm selection portfolio."""

class MultiCMA(Optimizer):
    """Multiple CMA-ES instances."""

class TripleCMA(Optimizer):
    """Three parallel CMA-ES instances."""

class Portfolio(Optimizer):
    """Basic portfolio optimizer."""

class ParaPortfolio(Optimizer):
    """Parallel portfolio execution."""

class ASCMADEthird(Optimizer):
    """Adaptive selection of CMA and DE."""

class ASCMADEQRthird(Optimizer):
    """Adaptive selection with quasi-random initialization."""
```
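A toy version of the portfolio idea: give each member strategy a share of the budget, run them independently, and keep whichever result is best. The two members here (uniform random search and a greedy Gaussian walk) are deliberately simple stand-ins, not the strategies the real portfolios use.

```python
import random

def portfolio_minimize(f, dim, budget, seed=None):
    """Toy portfolio (sketch): split the budget equally between member
    strategies, run each independently, return the best result found."""
    rng = random.Random(seed)

    def random_search(n):
        # Member 1: global uniform sampling over [-5, 5]^dim.
        best = None
        for _ in range(n):
            x = [rng.uniform(-5, 5) for _ in range(dim)]
            fx = f(x)
            if best is None or fx < best[1]:
                best = (x, fx)
        return best

    def gaussian_walk(n):
        # Member 2: greedy (1+1)-style local walk from a random start.
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        fx = f(x)
        for _ in range(n - 1):
            y = [xi + rng.gauss(0.0, 0.5) for xi in x]
            fy = f(y)
            if fy < fx:
                x, fx = y, fy
        return (x, fx)

    members = [random_search, gaussian_walk]
    share = budget // len(members)
    return min((m(share) for m in members), key=lambda r: r[1])
```

The payoff is robustness: no single member has to suit the problem, as long as one of them does. Adaptive portfolios such as NGOpt go further and shift budget toward the members that are performing best.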
### Sequential and Chaining Optimizers
Multi-stage optimization approaches that combine different algorithms sequentially for improved performance.

```python { .api }
class ChainCMAPowell(Optimizer):
    """CMA-ES followed by Powell method."""

class ChainDEwithLHS(Optimizer):
    """DE with Latin Hypercube initialization."""

class ChainNaiveTBPSACMAPowell(Optimizer):
    """Sequential TBPSA, CMA, and Powell."""

class CMAL(Optimizer):
    """CMA-ES with local search refinement."""

class CMALarge(Optimizer):
    """CMA for large-scale problems with chaining."""
```
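The hand-off pattern behind chaining can be sketched in a few lines: a global exploration stage spends part of the budget, then passes its best point to a local refinement stage. Here random search feeds a shrinking coordinate search, a crude stand-in for chains like CMA followed by Powell.

```python
import random

def chain_minimize(f, dim, budget, seed=None):
    """Chaining sketch: global stage hands its best point to a local stage."""
    rng = random.Random(seed)

    # Stage 1: global exploration by uniform random search.
    best_x, best_f = None, float("inf")
    coarse = budget // 3
    for _ in range(coarse):
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx

    # Stage 2: local refinement by coordinate descent with a shrinking step,
    # started from the best point stage 1 found.
    x, fx, step = list(best_x), best_f, 1.0
    evals = budget - coarse
    while evals > 0 and step > 1e-8:
        improved = False
        for d in range(dim):
            for delta in (step, -step):
                if evals <= 0:
                    break
                y = list(x)
                y[d] += delta
                fy = f(y)
                evals -= 1
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5   # no axis move helped: tighten the search
    return x, fx
```

The division of labor is the point: the first stage only needs to land in the right basin, and the second stage only needs to polish within it.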
### Specialized and External Algorithm Integrations
Integrations with external optimization libraries and specialized algorithms for specific problem domains.

```python { .api }
class HyperOpt(Optimizer):
    """Hyperopt library integration."""

class SMAC(Optimizer):
    """SMAC algorithm integration."""

class SMAC3(Optimizer):
    """SMAC3 algorithm integration."""

class AX(Optimizer):
    """Facebook Ax platform integration."""

class Optuna(Optimizer):
    """Optuna optimization framework integration."""

class PymooNSGA2(Optimizer):
    """NSGA-II multi-objective optimization."""

class PymooDEwithLHS(Optimizer):
    """Pymoo DE with Latin Hypercube Sampling."""
```
### Utility Functions
Helper functions for optimizer analysis, comparison, and learning from optimization results.

```python { .api }
def learn_on_k_best(archive, k: int, method: str = "polynomial") -> Callable:
    """
    Learn a meta-model on the k best candidates.

    Args:
        archive: optimization archive with evaluated points
        k: number of best points to use for learning
        method: learning method ("polynomial", "neural", "svm", "rf")

    Returns:
        Learned model function
    """

def addCompare(name1: str, name2: str) -> None:
    """
    Add an optimizer comparison for benchmarking.

    Args:
        name1: first optimizer name
        name2: second optimizer name
    """
```
## Usage Examples
### Basic Optimization

```python
import nevergrad as ng

# Define the function to minimize
def sphere(x):
    return sum(x**2)

# Set up the parametrization and optimizer
param = ng.p.Array(shape=(10,))
optimizer = ng.optimizers.CMA(parametrization=param, budget=100)

# Ask-tell optimization loop
for _ in range(optimizer.budget):
    x = optimizer.ask()
    loss = sphere(x.value)
    optimizer.tell(x, loss)

# Get the result
recommendation = optimizer.provide_recommendation()
```
### Comparing Different Optimizers
```python
import nevergrad as ng

def sphere(x):
    return sum(x**2)

# Try different optimizers on the same problem
optimizers_to_test = [
    ng.optimizers.CMA,
    ng.optimizers.DE,
    ng.optimizers.PSO,
    ng.optimizers.NGOpt,  # automatic algorithm selection
]

budget = 100

results = {}
for optimizer_class in optimizers_to_test:
    param = ng.p.Array(shape=(5,))  # fresh parametrization for each run
    optimizer = optimizer_class(parametrization=param, budget=budget)
    result = optimizer.minimize(sphere)
    results[optimizer_class.__name__] = sphere(result.value)

print("Results:", results)
```
### Multi-objective Optimization
```python
import nevergrad as ng

def multi_objective_function(x):
    # Return multiple objectives, all to be minimized
    obj1 = sum(x**2)          # sphere
    obj2 = sum((x - 1)**2)    # distance from the all-ones vector
    return [obj1, obj2]

param = ng.p.Array(shape=(5,))
optimizer = ng.optimizers.CMA(parametrization=param, budget=100)

for _ in range(optimizer.budget):
    x = optimizer.ask()
    losses = multi_objective_function(x.value)
    optimizer.tell(x, losses)  # a list of losses triggers multi-objective mode

# Get the Pareto front
pareto_front = optimizer.pareto_front()
```
### Using Registry for Dynamic Optimizer Selection
```python
import nevergrad as ng

# Get an optimizer class by name
param = ng.p.Array(shape=(5,))
optimizer_name = "CMA"
optimizer_class = ng.optimizers.registry[optimizer_name]
optimizer = optimizer_class(parametrization=param, budget=100)

# List all available optimizers
available_optimizers = list(ng.optimizers.registry.keys())
print(f"Available optimizers: {len(available_optimizers)}")
```