# Nevergrad

A comprehensive Python toolbox for gradient-free optimization. Nevergrad provides a unified interface for black-box optimization with extensive support for parameter types, optimization algorithms, and monitoring. It handles complex parametrization, including continuous, discrete, and mixed variables, with advanced features such as log-distributed parameters, categorical choices, and hierarchical parameter structures.

## Package Information

- **Package Name**: nevergrad
- **Language**: Python
- **Installation**: `pip install nevergrad`

## Core Imports

```python
import nevergrad as ng
```

Common usage patterns:

```python
# Parametrization
import nevergrad as ng
param = ng.p.Array(shape=(10,))

# Optimization
optimizer = ng.optimizers.CMA(parametrization=param, budget=100)

# Operations (constraints, mutations, integer casting)
from nevergrad import ops

# Functions and benchmarking
from nevergrad.functions import ArtificialFunction
from nevergrad.benchmark import Experiment

# Error handling
from nevergrad import errors
```

## Basic Usage

```python
import nevergrad as ng
import numpy as np

# Define the function to optimize (minimize)
def sphere(x):
    return sum(x**2)

# Create parametrization - array of 10 floats
parametrization = ng.p.Array(shape=(10,))

# Choose optimizer and budget
optimizer = ng.optimizers.CMA(parametrization=parametrization, budget=100)

# Optimization loop
for _ in range(optimizer.budget):
    x = optimizer.ask()      # Get candidate
    loss = sphere(x.value)   # Evaluate function
    optimizer.tell(x, loss)  # Tell optimizer the result

# Get recommendation
recommendation = optimizer.provide_recommendation()
print(f"Best point: {recommendation.value}")
print(f"Best loss: {sphere(recommendation.value)}")
```
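
The ask/tell protocol above can be illustrated without Nevergrad itself. Below is a minimal random-search "optimizer" exposing the same interface; this is an illustrative sketch of the protocol, not Nevergrad's implementation:

```python
import random

class RandomSearch:
    """Toy optimizer with a Nevergrad-style ask/tell interface (illustrative only)."""
    def __init__(self, dimension, budget):
        self.dimension = dimension
        self.budget = budget
        self.best = None  # (loss, point) pair

    def ask(self):
        # Propose a uniformly random candidate in [-5, 5]^d
        return [random.uniform(-5, 5) for _ in range(self.dimension)]

    def tell(self, candidate, loss):
        # Keep the best (loss, candidate) pair seen so far
        if self.best is None or loss < self.best[0]:
            self.best = (loss, candidate)

    def provide_recommendation(self):
        return self.best[1]

def sphere(x):
    return sum(v * v for v in x)

opt = RandomSearch(dimension=3, budget=200)
for _ in range(opt.budget):
    x = opt.ask()
    opt.tell(x, sphere(x))

best = opt.provide_recommendation()
```

The separation of `ask` and `tell` is what allows evaluations to run asynchronously or in parallel: the optimizer never calls the objective itself.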

## Architecture

Nevergrad follows a layered architecture with clear separation of concerns:

- **Parametrization Layer**: Handles parameter types, transformations, and mutations through the `ng.p` module
- **Optimization Layer**: Provides algorithm implementations and optimization logic through `ng.optimizers`
- **Configuration Layer**: Enables optimizer customization through parametrizable families in `ng.families`
- **Monitoring Layer**: Supports optimization tracking and logging through `ng.callbacks`
- **Operations Layer**: Provides constraints, mutation operators, and parameter transformations through `ng.ops`
- **Type System**: Ensures type safety and provides clear interfaces through `ng.typing`
- **Error Handling**: Provides comprehensive error reporting through `ng.errors`

This design enables maximum flexibility and extensibility for machine learning hyperparameter optimization, neural architecture search, automated algorithm configuration, and general scientific optimization tasks.

## Capabilities

### Parametrization System

Comprehensive parameter handling supporting scalar values, arrays, discrete choices, hierarchical structures, and constraints. Includes data transformations, bounds handling, mutation strategies, and constraint satisfaction.

```python { .api }
# Core parameter types
class Parameter:
    value: Any
    dimension: int
    def mutate(self) -> None: ...
    def sample(self) -> 'Parameter': ...

class Array(Parameter): ...
class Scalar(Parameter): ...
class Choice(Parameter): ...
class Dict(Parameter): ...
class Tuple(Parameter): ...
```

[Parametrization](./parametrization.md)
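
To make the `mutate`/`sample` contract concrete, here is a pure-Python sketch of a categorical parameter with that interface. The class below is illustrative only, not Nevergrad's `ng.p.Choice` implementation:

```python
import random

class Choice:
    """Minimal sketch of a categorical parameter with the mutate/sample
    interface shown in the API block (not Nevergrad's implementation)."""
    def __init__(self, options):
        self.options = list(options)
        self.value = self.options[0]

    def mutate(self):
        # In-place mutation: jump to a random category
        self.value = random.choice(self.options)

    def sample(self):
        # Return a fresh, independently drawn child parameter
        child = Choice(self.options)
        child.mutate()
        return child

optimizer_kind = Choice(["sgd", "adam", "rmsprop"])
optimizer_kind.mutate()
child = optimizer_kind.sample()
```

The distinction matters for evolutionary algorithms: `mutate` perturbs an existing candidate in place, while `sample` produces an independent new one.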

### Optimization Algorithms

368+ registered optimization algorithms, including Evolution Strategies, Differential Evolution, Particle Swarm Optimization, Bayesian Optimization, meta-model approaches, and scipy-based methods, all exposed through a single unified interface.

```python { .api }
class Optimizer:
    def __init__(self, parametrization, budget=None, num_workers=1): ...
    def ask(self) -> Parameter: ...
    def tell(self, candidate: Parameter, loss: float) -> None: ...
    def minimize(self, function) -> Parameter: ...
    def provide_recommendation(self) -> Parameter: ...
```

[Optimizers](./optimizers.md)
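
The `num_workers` parameter in the signature above signals batched evaluation: several candidates can be asked before any result is told back. A pure-Python sketch of that pattern, using a toy optimizer (illustrative, not Nevergrad's):

```python
import random

class RandomSearch:
    """Toy ask/tell optimizer used only to illustrate batching."""
    def __init__(self, dimension, budget, num_workers=1):
        self.dimension = dimension
        self.budget = budget
        self.num_workers = num_workers
        self.best = None

    def ask(self):
        return [random.uniform(-5, 5) for _ in range(self.dimension)]

    def tell(self, candidate, loss):
        if self.best is None or loss < self.best[0]:
            self.best = (loss, candidate)

def sphere(x):
    return sum(v * v for v in x)

opt = RandomSearch(dimension=2, budget=16, num_workers=4)
evaluated = 0
while evaluated < opt.budget:
    # With num_workers > 1, ask several candidates up front...
    batch = [opt.ask() for _ in range(opt.num_workers)]
    # ...evaluate them (possibly in parallel processes), then tell results back
    for cand in batch:
        opt.tell(cand, sphere(cand))
    evaluated += len(batch)
```

In real use, the inner evaluation loop would be dispatched to a process pool or cluster; the optimizer only sees `ask` and `tell` calls.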

### Optimizer Families

Parametrizable optimizer configurations enabling algorithm customization and automated hyperparameter tuning. Provides factory patterns for creating specialized optimizer variants.

```python { .api }
class ParametrizedCMA: ...
class ParametrizedBO: ...
class DifferentialEvolution: ...
class Chaining: ...
```

[Optimizer Families](./optimizer-families.md)
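
The factory pattern behind families can be sketched in a few lines: a configuration object whose call builds a concrete, configured optimizer. All names below are illustrative, not Nevergrad's:

```python
import random

class SimpleRandomSearch:
    """Toy optimizer built by the family below (illustrative only)."""
    def __init__(self, dimension, budget, scale):
        self.dimension, self.budget, self.scale = dimension, budget, scale
        self.best = None

    def ask(self):
        return [random.uniform(-self.scale, self.scale) for _ in range(self.dimension)]

    def tell(self, candidate, loss):
        if self.best is None or loss < self.best[0]:
            self.best = (loss, candidate)

class ParametrizedRandomSearch:
    """Sketch of the family pattern: a configuration object that,
    when called, produces a configured optimizer instance."""
    def __init__(self, scale=1.0):
        self.scale = scale

    def __call__(self, dimension, budget):
        return SimpleRandomSearch(dimension, budget, self.scale)

# A configured "variant" that can be reused like any other optimizer class
WideSearch = ParametrizedRandomSearch(scale=10.0)
opt = WideSearch(dimension=3, budget=50)
```

This is what makes families themselves tunable: the family's constructor arguments (here `scale`) are hyperparameters that can be optimized automatically.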

### Monitoring and Callbacks

Comprehensive callback system for optimization monitoring, logging, progress tracking, early stopping, and state persistence during optimization runs.

```python { .api }
class OptimizationPrinter: ...
class OptimizationLogger: ...
class ProgressBar: ...
class EarlyStopping: ...
```

[Callbacks](./callbacks.md)
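
The early-stopping idea can be sketched as a small stateful callable invoked after each `tell`; this is an illustrative patience-based criterion, not Nevergrad's `EarlyStopping` class:

```python
class EarlyStopping:
    """Sketch: signal a stop when no improvement for `patience` tells."""
    def __init__(self, patience):
        self.patience = patience
        self.best = float("inf")
        self.stale = 0

    def __call__(self, loss):
        if loss < self.best:
            # New best: reset the staleness counter
            self.best, self.stale = loss, 0
        else:
            self.stale += 1
        return self.stale >= self.patience  # True -> stop the loop

stopper = EarlyStopping(patience=3)
losses = [5.0, 4.0, 4.5, 4.2, 4.9]
stops = [stopper(loss) for loss in losses]
print(stops)  # [False, False, False, False, True]
```

In an ask/tell loop, the driving code checks the callback's return value after each `tell` and breaks out when it fires.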

### Type System and Error Handling

Rich type system with protocol definitions and comprehensive error handling for robust optimization workflows.

```python { .api }
# Key types
ArrayLike = Union[Tuple[float, ...], List[float], np.ndarray]
Loss = Union[float, ArrayLike]

# Error handling
class NevergradError(Exception): ...
class NevergradRuntimeError(NevergradError): ...
```

[Types and Errors](./types-and-errors.md)
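
Because the errors form a hierarchy with a single base class, one `except` clause can catch any library error. A self-contained sketch using the class names from the API block above:

```python
# Hierarchy as declared in the API block: a shared base class lets
# callers catch any library error with a single except clause.
class NevergradError(Exception):
    """Base class for all errors (as in the API sketch above)."""

class NevergradRuntimeError(NevergradError):
    """Runtime failure during optimization."""

def risky_step():
    raise NevergradRuntimeError("budget exhausted")

try:
    risky_step()
except NevergradError as exc:  # catches the subclass too
    caught = type(exc).__name__
print(caught)  # NevergradRuntimeError
```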

### Operations and Transformations

Specialized parameter operations including constraint handling, mutation operators for evolutionary algorithms, and parameter transformations for discrete optimization.

```python { .api }
class Constraint:
    def __init__(self, func, optimizer="NGOpt", budget=100): ...
    def __call__(self, parameter): ...

class Mutation:
    def __call__(self, parameter, inplace=False): ...

class Crossover(Mutation): ...
class Translation(Mutation): ...

def Int(deterministic=True): ...
```

[Operations](./ops.md)
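
One common way to apply a constraint in black-box optimization is to fold violations into the loss as a penalty. The helper below is an illustrative sketch of that idea in plain Python, not Nevergrad's `Constraint` operator:

```python
def constrained_loss(loss_fn, constraint_fn, penalty=1e3):
    """Sketch: penalize constraint violations additively.
    constraint_fn(x) > 0 means the constraint is violated."""
    def wrapped(x):
        violation = max(0.0, constraint_fn(x))
        return loss_fn(x) + penalty * violation
    return wrapped

def sphere(x):
    return sum(v * v for v in x)

# Constraint: x[0] >= 1, expressed as violation = 1 - x[0]
f = constrained_loss(sphere, lambda x: 1 - x[0])

feasible = f([1.0, 0.0])    # no penalty applied
infeasible = f([0.0, 0.0])  # violation of 1 -> large penalty
print(feasible, infeasible)  # 1.0 1000.0
```

Penalty-based handling keeps the objective a plain scalar function, so any gradient-free optimizer can be used unchanged.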

### Functions and Benchmarking

Comprehensive benchmarking framework with 37+ artificial test functions and systematic experiment management for optimizer evaluation and comparison.

```python { .api }
class ArtificialFunction:
    def __init__(self, name, block_dimension, **config): ...
    def __call__(self, x) -> float: ...

class Experiment:
    def __init__(self, function, optimizer, budget, **params): ...
    def run(self) -> Parameter: ...

class ExperimentFunction:
    def __init__(self, function, parametrization): ...
```

[Functions and Benchmarking](./benchmark.md)
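
The core of an `Experiment`-style run is seeding, an ask/tell loop, and reporting the best loss, so results are reproducible and comparable across optimizers. The scaffolding below is an illustrative sketch, not Nevergrad's benchmark machinery:

```python
import random

def sphere(x):
    return sum(v * v for v in x)

class RandomSearch:
    """Toy ask/tell optimizer for the sketch below (illustrative only)."""
    def __init__(self, dimension, budget):
        self.dimension, self.budget, self.best = dimension, budget, None

    def ask(self):
        return [random.uniform(-5, 5) for _ in range(self.dimension)]

    def tell(self, candidate, loss):
        if self.best is None or loss < self.best[0]:
            self.best = (loss, candidate)

def run_experiment(make_optimizer, function, dimension, budget, seed):
    """Sketch of an Experiment-style run: seed, loop ask/tell, report best loss."""
    random.seed(seed)
    opt = make_optimizer(dimension, budget)
    for _ in range(budget):
        cand = opt.ask()
        opt.tell(cand, function(cand))
    return opt.best[0]

# Same seed -> identical result, as a benchmark harness requires
a = run_experiment(RandomSearch, sphere, dimension=2, budget=100, seed=42)
b = run_experiment(RandomSearch, sphere, dimension=2, budget=100, seed=42)
print(a == b)  # True
```

Fixing the seed per experiment is what makes optimizer comparisons fair: every algorithm sees the same function under controlled randomness.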