# SciPy Interface

A SciPy-compatible minimization interface that provides the same API as `scipy.optimize.minimize`. This allows easy integration with existing SciPy workflows while giving access to MINUIT's robust minimization algorithms.

## Capabilities

### Minimize Function

The main interface function, compatible with the `scipy.optimize.minimize` API.

```python { .api }
def minimize(fun, x0, args=(), method="migrad", jac=None, hess=None, hessp=None,
             bounds=None, constraints=None, tol=None, callback=None, options=None):
    """
    Interface to MIGRAD using the scipy.optimize.minimize API.

    This function provides the same interface as scipy.optimize.minimize. If you
    are familiar with the latter, it lets you get started with Minuit quickly.

    Args:
        fun: Objective function to minimize.
            Signature: fun(x, *args) -> float
        x0: Initial parameter values (array-like).
        args: Extra arguments passed to the objective function (tuple, optional).
        method: Minimization method ("migrad" or "simplex", default: "migrad").
        jac: Gradient function (callable, bool, or None).
            If callable: jac(x, *args) -> array.
            If True: fun returns (f, g), where g is the gradient.
            If None: a numerical gradient is used.
        hess: Hessian function (ignored, accepted for compatibility).
        hessp: Hessian-vector product (ignored, accepted for compatibility).
        bounds: Parameter bounds (sequence of (min, max) tuples, or None).
        constraints: Constraints (ignored, accepted for compatibility).
        tol: Tolerance for convergence (float, optional).
        callback: Callback called after each iteration (ignored).
        options: Dictionary of solver options (optional).

    Returns:
        OptimizeResult: Result object with attributes:
            x: Final parameter values (ndarray)
            fun: Final function value (float)
            success: Whether the optimization succeeded (bool)
            status: Termination status (int)
            message: Termination message (str)
            nfev: Number of function evaluations (int)
            nit: Number of iterations (int)
            hess_inv: Inverse Hessian approximation (2D array, if available)
    """
```

### Options Dictionary

Supported options for controlling minimization behavior.

```python { .api }
# Options dictionary keys (all optional):
options = {
    "disp": bool,    # Set to True to print convergence messages (default: False)
    "stra": int,     # Minuit strategy (0: fast, 1: balanced, 2: accurate; default: 1)
    "maxfun": int,   # Maximum allowed number of function evaluations (default: None)
    "maxfev": int,   # Deprecated alias for maxfun
    "eps": float,    # Initial step size for the numerical derivative (default: 1)
}
```

## Usage Examples

### Basic Usage

```python
from iminuit import minimize
import numpy as np

# Define the objective function
def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

# Minimize using MIGRAD
result = minimize(rosenbrock, x0=[0, 0])

print(f"Success: {result.success}")
print(f"Minimum at: {result.x}")
print(f"Function value: {result.fun}")
print(f"Function evaluations: {result.nfev}")
```

### With Bounds

```python
# Define bounds for the parameters
bounds = [(0, 2), (-1, 3)]  # x[0] in [0, 2], x[1] in [-1, 3]

result = minimize(rosenbrock, x0=[0.5, 0.5], bounds=bounds)
print(f"Bounded minimum: {result.x}")
```

### With Gradient

```python
def rosenbrock_with_grad(x):
    f = (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    g = np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2)
    ])
    return f, g

# jac=True tells minimize that fun returns (value, gradient)
result = minimize(rosenbrock_with_grad, x0=[0, 0], jac=True)
print(f"With gradient - minimum at: {result.x}")
```

### Alternative: Separate Gradient Function

```python
def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2)
    ])

result = minimize(rosenbrock, x0=[0, 0], jac=rosenbrock_grad)
```
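
Analytic gradients are easy to get wrong, so it can pay to compare them against a central finite-difference approximation before passing them via `jac`. A minimal sketch (the `finite_diff_grad` helper is illustrative, not part of the API):

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def finite_diff_grad(f, x, h=1e-6):
    # Central differences, one coordinate at a time
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

x = np.array([0.3, -0.7])
assert np.allclose(rosenbrock_grad(x), finite_diff_grad(rosenbrock, x), atol=1e-3)
```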

### With Options

```python
# Set minimization options
options = {
    "disp": True,     # Print convergence messages
    "stra": 2,        # Use the accurate strategy
    "maxfun": 10000,  # Maximum function evaluations
    "eps": 0.1,       # Initial step size
}

result = minimize(rosenbrock, x0=[0, 0], options=options)
```

### Using Simplex Method

```python
# Use Simplex instead of MIGRAD
result = minimize(rosenbrock, x0=[0, 0], method="simplex")
print(f"Simplex result: {result.x}")
```

### With Extra Arguments

```python
def quadratic(x, a, b):
    return a * (x[0] - 1)**2 + b * (x[1] - 2)**2

# Pass extra arguments to the function
result = minimize(quadratic, x0=[0, 0], args=(2, 3))
print(f"Minimum with args: {result.x}")
```

### Comparing with SciPy

```python
from scipy.optimize import minimize as scipy_minimize
from iminuit import minimize as iminuit_minimize

# Same function, different optimizers
result_scipy = scipy_minimize(rosenbrock, x0=[0, 0], method="BFGS")
result_iminuit = iminuit_minimize(rosenbrock, x0=[0, 0])

print(f"SciPy result: {result_scipy.x}, nfev: {result_scipy.nfev}")
print(f"iminuit result: {result_iminuit.x}, nfev: {result_iminuit.nfev}")
```

### Error Handling

```python
def problematic_function(x):
    if x[0] < 0:
        return np.inf  # Return inf for invalid regions
    return (x[0] - 1)**2 + (x[1] - 2)**2

result = minimize(problematic_function, x0=[-1, 0])
if not result.success:
    print(f"Optimization failed: {result.message}")
else:
    print(f"Success: {result.x}")
```

### Integration with Existing SciPy Code

```python
# Drop-in replacement for scipy.optimize.minimize
def my_optimization_routine(objective, initial_guess):
    # This function works with either scipy or iminuit
    result = minimize(objective, initial_guess, method="migrad")
    return result.x, result.fun

# Usage
best_params, best_value = my_optimization_routine(rosenbrock, [0, 0])
```

## Return Object Details

The `OptimizeResult` object returned by `minimize` contains:

```python { .api }
class OptimizeResult:
    """Result of minimization."""

    x: np.ndarray         # Final parameter values
    fun: float            # Final function value
    success: bool         # Whether the optimization succeeded
    status: int           # Termination status code
    message: str          # Termination message
    nfev: int             # Number of function evaluations
    nit: int              # Number of iterations
    hess_inv: np.ndarray  # Inverse Hessian approximation (if available)
```
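
Because this object mirrors SciPy's `OptimizeResult`, its fields are read the same way regardless of backend. A minimal sketch, shown here with `scipy.optimize.minimize` (whose BFGS method also fills `hess_inv`); note that the statistical interpretation of `hess_inv` depends on how the objective is normalized (e.g. Minuit's errordef convention), so treat the derived numbers as a rough scale only:

```python
import numpy as np
from scipy.optimize import minimize  # same API as iminuit's minimize

def quadratic(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

result = minimize(quadratic, x0=[0.0, 0.0], method="BFGS")

# Access the documented fields of OptimizeResult
print("x:", result.x)
print("fun:", result.fun)
print("success:", result.success)
print("nfev:", result.nfev)

# hess_inv, when available, gives a rough uncertainty scale via the
# square root of its diagonal (interpretation depends on the objective)
errors = np.sqrt(np.diag(result.hess_inv))
print("approximate uncertainty scale:", errors)
```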

## Compatibility Notes

- **Supported features**: Basic minimization, bounds, gradients, options
- **Unsupported features**: `hess`, `hessp`, `constraints`, `callback` (accepted but ignored)
- **Method options**: Only "migrad" and "simplex" are supported
- **Bounds format**: Sequence of (min, max) tuples; None for unbounded
- **Gradient format**: Compatible with SciPy's `jac` parameter conventions

The interface is designed for easy migration from SciPy-based code while providing access to MINUIT's robust minimization algorithms.
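
Since the call signature matches for the supported features, code can degrade gracefully when iminuit is not installed. A minimal fallback sketch, assuming only the features listed above are used:

```python
# Prefer iminuit's minimize when installed; otherwise fall back to
# scipy.optimize.minimize, which accepts the same arguments for the
# supported feature set (basic minimization, bounds, gradients).
try:
    from iminuit import minimize
except ImportError:
    from scipy.optimize import minimize

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

result = minimize(rosenbrock, x0=[0.0, 0.0])
print(result.x, result.fun)
```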