# Testing Functions

Common test functions for optimization algorithm benchmarking and validation. These functions provide standard optimization test cases with known minima for evaluating minimizer performance.

## Capabilities

### Classic Test Functions

Well-known optimization test functions with documented properties and global minima.

```python { .api }
def rosenbrock(x, y):
    """
    Rosenbrock function. Minimum: f(1, 1) = 0.

    The Rosenbrock function is a non-convex function used as a performance test
    problem for optimization algorithms. It was introduced by Howard H. Rosenbrock
    in 1960. Also known as Rosenbrock's valley or Rosenbrock's banana function.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value

    Reference:
        https://en.wikipedia.org/wiki/Rosenbrock_function
    """

def rosenbrock_grad(x, y):
    """
    Gradient of the Rosenbrock function.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        Tuple[float, float]: Gradient components (df/dx, df/dy)
    """

def ackley(x, y):
    """
    Ackley function. Minimum: f(0, 0) = 0.

    The Ackley function is widely used for testing optimization algorithms.
    It is characterized by a nearly flat outer region and a large hole at the center.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value

    Reference:
        https://en.wikipedia.org/wiki/Ackley_function
    """

def beale(x, y):
    """
    Beale function. Minimum: f(3, 0.5) = 0.

    The Beale function is multimodal, with sharp peaks at the corners of the
    input domain.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value

    Reference:
        https://en.wikipedia.org/wiki/Test_functions_for_optimization
    """

def matyas(x, y):
    """
    Matyas function. Minimum: f(0, 0) = 0.

    The Matyas function has no local minima except the global one.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value
    """
```
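
For reference, these docstrings correspond to standard textbook forms. The sketch below implements those forms directly (the `_ref` names are illustrative, not part of `iminuit.testing`, and iminuit's own implementations may differ in detail):

```python
import math

# Standard Rosenbrock form with a=1, b=100: f(x, y) = (a - x)^2 + b*(y - x^2)^2
def rosenbrock_ref(x, y):
    return (1 - x) ** 2 + 100 * (y - x**2) ** 2

# Analytic gradient obtained by differentiating the form above
def rosenbrock_grad_ref(x, y):
    dfdx = -2 * (1 - x) - 400 * x * (y - x**2)
    dfdy = 200 * (y - x**2)
    return dfdx, dfdy

# Standard 2D Ackley form with a=20, b=0.2, c=2*pi
def ackley_ref(x, y):
    term1 = -20 * math.exp(-0.2 * math.sqrt(0.5 * (x**2 + y**2)))
    term2 = -math.exp(0.5 * (math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y)))
    return term1 + term2 + 20 + math.e

# Sanity checks against the documented minima
assert rosenbrock_ref(1, 1) == 0
assert rosenbrock_grad_ref(1, 1) == (0, 0)
assert abs(ackley_ref(0, 0)) < 1e-12
```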

### Multi-dimensional Functions

Test functions that work with N-dimensional parameter vectors.

```python { .api }
def sphere_np(x):
    """
    N-dimensional sphere function. Minimum: f([0, 0, ..., 0]) = 0.

    Simple convex quadratic function, often used as a basic test case.

    Args:
        x: Parameter vector (array-like)

    Returns:
        float: Sum of squares of all parameters
    """
```
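
The docstring pins down the definition completely; a one-line NumPy sketch of the same function (the name `sphere_ref` is illustrative):

```python
import numpy as np

def sphere_ref(x):
    # f(x) = sum_i x_i^2, convex with a single minimum at the origin
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

assert sphere_ref([0.0, 0.0, 0.0]) == 0.0
```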

### Additional Test Functions

Extended collection of optimization test functions for comprehensive benchmarking.

```python { .api }
def goldstein_price(x, y):
    """
    Goldstein-Price function. Minimum: f(0, -1) = 3.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value
    """

def booth(x, y):
    """
    Booth function. Minimum: f(1, 3) = 0.

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value
    """

def himmelblau(x, y):
    """
    Himmelblau's function. Has four identical local minima.

    Minima at:
    - f(3.0, 2.0) = 0
    - f(-2.805118, 3.131312) = 0
    - f(-3.779310, -3.283186) = 0
    - f(3.584428, -1.848126) = 0

    Args:
        x: First parameter (float)
        y: Second parameter (float)

    Returns:
        float: Function value
    """
```
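
For reference, the standard forms of two of these functions, matching the minima quoted above (the `_ref` names are illustrative, not part of `iminuit.testing`):

```python
# Booth: f(x, y) = (x + 2y - 7)^2 + (2x + y - 5)^2
def booth_ref(x, y):
    return (x + 2 * y - 7) ** 2 + (2 * x + y - 5) ** 2

# Himmelblau: f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2
def himmelblau_ref(x, y):
    return (x**2 + y - 11) ** 2 + (x + y**2 - 7) ** 2

# Sanity checks against the documented minima
assert booth_ref(1, 3) == 0
assert himmelblau_ref(3.0, 2.0) == 0
```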

## Usage Examples

### Basic Function Testing

```python
from iminuit import Minuit
from iminuit.testing import rosenbrock, rosenbrock_grad

# Test Rosenbrock function minimization
m = Minuit(rosenbrock, x=0, y=0)
m.migrad()

print(f"Minimum found at: x={m.values['x']:.6f}, y={m.values['y']:.6f}")
print(f"Function value: {m.fval:.6f}")
print("Expected minimum: (1, 1) with f=0")
```
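
The known minimum makes it easy to turn the printout into an automated check; a minimal sketch using `math.isclose` (the tolerances are illustrative, not prescribed by iminuit):

```python
import math

assert m.valid                                         # MIGRAD reports convergence
assert math.isclose(m.values["x"], 1.0, abs_tol=1e-3)  # close to the known minimum
assert math.isclose(m.values["y"], 1.0, abs_tol=1e-3)
assert m.fval < 1e-6                                   # function value near zero
```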

### Using Gradients

```python
# Use analytical gradient for better convergence
m_with_grad = Minuit(rosenbrock, x=0, y=0, grad=rosenbrock_grad)
m_with_grad.migrad()

print(f"With gradient - Function calls: {m_with_grad.nfcn}")
print(f"Without gradient - Function calls: {m.nfcn}")
```
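
Before trusting an analytical gradient, it is worth cross-checking it against finite differences; a minimal sketch (the helper name, step size, and tolerance are illustrative choices):

```python
def check_gradient(f, grad, x, y, h=1e-6):
    # Central finite differences as a reference for the analytic gradient
    num_dx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    num_dy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    ana_dx, ana_dy = grad(x, y)
    assert abs(num_dx - ana_dx) < 1e-4 * max(1.0, abs(ana_dx))
    assert abs(num_dy - ana_dy) < 1e-4 * max(1.0, abs(ana_dy))

check_gradient(rosenbrock, rosenbrock_grad, 0.5, -0.3)
```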

### Testing Multiple Functions

```python
from iminuit.testing import ackley, beale, matyas

# (function, start, expected_min, expected_val)
test_functions = [
    (rosenbrock, (0, 0), (1, 1), 0),
    (ackley, (1, 1), (0, 0), 0),
    (beale, (1, 1), (3, 0.5), 0),
    (matyas, (1, 1), (0, 0), 0),
]

for func, start, expected_min, expected_val in test_functions:
    m = Minuit(func, x=start[0], y=start[1])
    m.migrad()

    print(f"\n{func.__name__}:")
    print(f"  Found: ({m.values['x']:.3f}, {m.values['y']:.3f}), f={m.fval:.6f}")
    print(f"  Expected: {expected_min}, f={expected_val}")
    print(f"  Converged: {m.valid}, Calls: {m.nfcn}")
```

### Multi-dimensional Testing

```python
from iminuit.testing import sphere_np
import numpy as np

# Test N-dimensional sphere function via named parameters
def sphere_5d(x1, x2, x3, x4, x5):
    return sphere_np([x1, x2, x3, x4, x5])

# Start from a random point
start = np.random.randn(5)
m = Minuit(sphere_5d, x1=start[0], x2=start[1], x3=start[2], x4=start[3], x5=start[4])
m.migrad()

print(f"5D sphere minimum: {list(m.values)}")
print(f"Function value: {m.fval:.6f}")
```
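
Since `sphere_np` already accepts an array, iminuit 2.x can also minimize it directly without the named-parameter wrapper, taking the starting values as a single sequence; a sketch of that approach:

```python
# Array-valued cost function: starting values passed as one sequence;
# parameters are auto-named unless name= is given
m_arr = Minuit(sphere_np, np.random.randn(5))
m_arr.migrad()

print(f"5D sphere minimum (array form): {list(m_arr.values)}")
```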

### Algorithm Comparison

```python
def test_algorithm_performance(func, start_point, methods=('migrad', 'simplex')):
    """Compare different minimization algorithms."""
    results = {}

    for method in methods:
        m = Minuit(func, x=start_point[0], y=start_point[1])

        if method == 'migrad':
            m.migrad()
        elif method == 'simplex':
            m.simplex()

        results[method] = {
            'minimum': (m.values['x'], m.values['y']),
            'fval': m.fval,
            'nfcn': m.nfcn,
            'valid': m.valid,
        }

    return results

# Test different algorithms on the Rosenbrock function
results = test_algorithm_performance(rosenbrock, (-1, -1))
for method, result in results.items():
    print(f"{method}: {result}")
```

258

259

### Convergence Analysis

260

261

```python
def convergence_study(func, start_points, tolerance_levels):
    """Study convergence from different starting points."""
    success_rate = {}

    for tol in tolerance_levels:
        successes = 0
        total_calls = 0

        for start in start_points:
            m = Minuit(func, x=start[0], y=start[1])
            m.tol = tol
            m.migrad()

            if m.valid:
                successes += 1
            total_calls += m.nfcn

        success_rate[tol] = {
            'rate': successes / len(start_points),
            'avg_calls': total_calls / len(start_points),
        }

    return success_rate

# Study convergence for different tolerance levels
start_points = [(0, 0), (1, 1), (-1, -1), (2, -2), (-2, 2)]
tolerances = [1e-3, 1e-4, 1e-5, 1e-6]

results = convergence_study(rosenbrock, start_points, tolerances)
for tol, result in results.items():
    print(f"Tolerance {tol}: Success rate {result['rate']:.1%}, "
          f"Avg calls {result['avg_calls']:.1f}")
```

### Performance Benchmarking

```python
import time

def benchmark_function(func, start_point, n_runs=10):
    """Benchmark minimization performance."""
    times = []
    nfcn_list = []

    for _ in range(n_runs):
        m = Minuit(func, x=start_point[0], y=start_point[1])

        # perf_counter is the appropriate clock for benchmarking
        start_time = time.perf_counter()
        m.migrad()
        end_time = time.perf_counter()

        times.append(end_time - start_time)
        nfcn_list.append(m.nfcn)

    return {
        'avg_time': np.mean(times),
        'std_time': np.std(times),
        'avg_nfcn': np.mean(nfcn_list),
        'std_nfcn': np.std(nfcn_list),
    }

# Benchmark different test functions
functions = [rosenbrock, ackley, beale]
for func in functions:
    result = benchmark_function(func, (0.5, 0.5))
    print(f"{func.__name__}: {result['avg_time']:.4f}±{result['std_time']:.4f}s, "
          f"{result['avg_nfcn']:.1f}±{result['std_nfcn']:.1f} calls")
```

### Custom Test Function

```python
def create_shifted_quadratic(center, scale):
    """Create a shifted and scaled quadratic function for testing."""
    def shifted_quadratic(x, y):
        return scale * ((x - center[0])**2 + (y - center[1])**2)

    # Add metadata for testing
    shifted_quadratic.minimum = center
    shifted_quadratic.minimum_value = 0.0
    shifted_quadratic.__name__ = f"shifted_quadratic_{center}_{scale}"

    return shifted_quadratic

# Create and test the custom function
custom_func = create_shifted_quadratic((2, -1), 0.5)
m = Minuit(custom_func, x=0, y=0)
m.migrad()

print(f"Custom function minimum: ({m.values['x']:.3f}, {m.values['y']:.3f})")
print(f"Expected: {custom_func.minimum}")
```

## Test Function Properties

| Function | Global Minimum | Function Value | Characteristics |
|----------|----------------|----------------|-----------------|
| Rosenbrock | (1, 1) | 0 | Non-convex, narrow curved valley |
| Ackley | (0, 0) | 0 | Many local minima, flat outer region |
| Beale | (3, 0.5) | 0 | Multimodal, sharp peaks |
| Matyas | (0, 0) | 0 | No local minima |
| Sphere | (0, ..., 0) | 0 | Convex, simple quadratic |
| Goldstein-Price | (0, -1) | 3 | Multiple local minima |
| Booth | (1, 3) | 0 | Simple quadratic with cross-term |
| Himmelblau | Multiple | 0 | Four identical global minima |

These functions provide a diverse set of optimization challenges for testing minimizer robustness, convergence speed, and accuracy.