# Minimization and Optimization

LMFIT provides comprehensive minimization capabilities, supporting more than 15 optimization methods from scipy.optimize and enhancing them with Parameter objects, improved error estimation, and MCMC sampling. The Minimizer class offers fine-grained control over the fitting process, while the minimize function provides a convenient high-level interface.

## Capabilities

### Minimizer Class

Core optimization engine providing detailed control over the fitting process.

```python { .api }
class Minimizer:
    """General minimizer for curve fitting and optimization"""

    def __init__(self, userfcn, params, fcn_args=None, fcn_kws=None,
                 iter_cb=None, scale_covar=True, nan_policy='raise',
                 reduce_fcn=None, calc_covar=True, max_nfev=None, **kws):
        """
        Create a Minimizer instance.

        Args:
            userfcn: Objective function to minimize
            params (Parameters): Parameters for the fit
            fcn_args (tuple): Additional positional arguments for the objective function
            fcn_kws (dict): Additional keyword arguments for the objective function
            iter_cb: Callback function called at each iteration
            scale_covar (bool): Scale covariance matrix for uncertainty estimation
            nan_policy (str): How to handle NaN values ('raise', 'propagate', 'omit')
            reduce_fcn: Function to reduce the residual array to a scalar
            calc_covar (bool): Calculate covariance matrix
            max_nfev (int): Maximum number of function evaluations
        """

    def minimize(self, method='leastsq', params=None, **kws):
        """
        Perform minimization.

        Args:
            method (str): Optimization method
            params (Parameters): Parameters to use for the fit
            **kws: Method-specific keyword arguments

        Returns:
            MinimizerResult: Fit results and statistics
        """

    def scalar_minimize(self, method='Nelder-Mead', params=None, **kws):
        """
        Use scalar minimization methods from scipy.optimize.

        Args:
            method (str): Scipy scalar method name
            params (Parameters): Parameters for the fit
            **kws: Method-specific arguments

        Returns:
            MinimizerResult: Fit results
        """

    def emcee(self, params=None, steps=1000, nwalkers=100, burn=0, thin=1,
              ntemps=1, **kws):
        """
        Markov Chain Monte Carlo sampling using emcee.

        Args:
            params (Parameters): Parameters for sampling
            steps (int): Number of MCMC steps
            nwalkers (int): Number of walkers
            burn (int): Number of burn-in steps
            thin (int): Thinning factor for the chain
            ntemps (int): Number of temperatures for parallel tempering
            **kws: Additional emcee arguments

        Returns:
            MinimizerResult: MCMC results with chain data
        """

    def prepare_fit(self, params=None):
        """
        Prepare parameters and arrays for fitting.

        Args:
            params (Parameters): Parameters for the fit
        """

    def unprepare_fit(self):
        """Clean up after fitting"""
```

### MinimizerResult Class

Container for minimization results with comprehensive fit statistics and parameter information.

```python { .api }
class MinimizerResult:
    """Results from minimization with fit statistics and parameter data"""

    def show_candidates(self, n_candidates=5, precision=3):
        """
        Display candidate solutions for ambiguous fits (e.g. from the brute method).

        Args:
            n_candidates (int): Number of candidates to show
            precision (int): Decimal precision for display
        """

    # Key attributes available after fitting:
    # success (bool): Whether optimization succeeded
    # message (str): Termination message from the optimizer
    # method (str): Optimization method used
    # nfev (int): Number of function evaluations
    # ndata (int): Number of data points
    # nvarys (int): Number of varied parameters
    # nfree (int): Degrees of freedom (ndata - nvarys)
    # chisqr (float): Chi-squared statistic
    # redchi (float): Reduced chi-squared (chisqr / nfree)
    # aic (float): Akaike Information Criterion
    # bic (float): Bayesian Information Criterion
    # params (Parameters): Best-fit parameters with uncertainties
    # var_names (list): Names of varied parameters
    # covar (ndarray): Covariance matrix
    # best_values (dict): Best-fit parameter values
    # init_values (dict): Initial parameter values
    # residual (ndarray): Residual array at the best fit
    # flatchain (DataFrame): Flattened MCMC chain (for emcee results)
```

### Standalone Minimize Function

High-level interface for minimization without creating a Minimizer instance.

```python { .api }
def minimize(fcn, params, method='leastsq', args=None, kws=None,
             scale_covar=True, iter_cb=None, reduce_fcn=None,
             nan_policy='raise', calc_covar=True, max_nfev=None, **fit_kws):
    """
    Minimize an objective function using the specified method.

    Args:
        fcn: Objective function to minimize
        params (Parameters): Parameters for the fit
        method (str): Optimization method to use
        args (tuple): Additional positional arguments for the objective function
        kws (dict): Additional keyword arguments for the objective function
        scale_covar (bool): Scale covariance matrix
        iter_cb: Iteration callback function
        reduce_fcn: Function to convert the residual array to a scalar
        nan_policy (str): Policy for handling NaN values
        calc_covar (bool): Calculate covariance matrix
        max_nfev (int): Maximum function evaluations
        **fit_kws: Method-specific keyword arguments

    Returns:
        MinimizerResult: Optimization results
    """
```

### Supported Optimization Methods

**Least-Squares Methods:**
- `'leastsq'`: Levenberg-Marquardt (default; fast and reliable for most curve-fitting problems)
- `'least_squares'`: Trust Region Reflective with bounds support

**Global Optimization Methods:**
- `'differential_evolution'`: Stochastic global optimizer (requires finite bounds on all varying parameters)
- `'brute'`: Brute force over a parameter grid
- `'basinhopping'`: Basin-hopping global optimizer
- `'ampgo'`: Adaptive Memory Programming for Global Optimization
- `'shgo'`: Simplicial Homology Global Optimization
- `'dual_annealing'`: Dual annealing global optimizer

**Local Scalar Methods:**
- `'nelder'`: Nelder-Mead simplex
- `'powell'`: Powell's method
- `'cg'`: Conjugate Gradient
- `'newton'`: Newton-CG
- `'l-bfgs-b'`: L-BFGS-B with bounds
- `'tnc'`: Truncated Newton with bounds
- `'cobyla'`: Constrained Optimization BY Linear Approximation
- `'slsqp'`: Sequential Least Squares Programming
- `'bfgs'`: BFGS quasi-Newton

**Sampling Methods:**
- `'emcee'`: Markov Chain Monte Carlo (requires the emcee package)

## Usage Examples

### Basic Minimization

```python
import numpy as np
from lmfit import minimize, Parameters

def objective(params, x, data):
    """Objective function: residual between model and data"""
    a = params['a']
    b = params['b']
    c = params['c']
    model = a * np.exp(-b * x) + c
    return model - data

# Create sample data
x = np.linspace(0, 15, 301)
data = 5.0 * np.exp(-0.5 * x) + np.random.normal(size=301, scale=0.2) + 2.0

# Set up parameters
params = Parameters()
params.add('a', value=10, min=0)
params.add('b', value=1, min=0)
params.add('c', value=2)

# Perform fit
result = minimize(objective, params, args=(x, data))
print(f"Chi-squared: {result.chisqr:.3f}")
print(f"Reduced chi-squared: {result.redchi:.3f}")
```

### Using Different Methods

```python
# Try different optimization methods.
# Global optimizers such as differential_evolution require finite
# bounds on every varying parameter, so add the missing bounds first.
params['a'].set(max=100)
params['b'].set(max=10)
params['c'].set(min=-10, max=10)

methods = ['leastsq', 'least_squares', 'differential_evolution', 'nelder']

for method in methods:
    result = minimize(objective, params, method=method, args=(x, data))
    print(f"{method}: chi-squared = {result.chisqr:.3f}")
```

### MCMC Sampling

```python
# Perform MCMC sampling (requires the emcee package)
result_mcmc = minimize(objective, params, method='emcee',
                       args=(x, data), steps=2000, nwalkers=50, burn=500)

# flatchain is a pandas DataFrame with one column per varied parameter
chain = result_mcmc.flatchain
print("MCMC parameter means:\n", chain.mean())
print("MCMC parameter stds:\n", chain.std())
```

### Advanced Minimizer Usage

```python
from lmfit import Minimizer

def iter_callback(params, iteration, residual, *args, **kws):
    """Callback called at each iteration; fcn_args/fcn_kws are passed through too"""
    print(f"Iteration {iteration}: chi-squared = {np.sum(residual**2):.3f}")

# Create minimizer with detailed control
minimizer = Minimizer(objective, params, fcn_args=(x, data),
                      iter_cb=iter_callback, calc_covar=True)

# Fit with specific method and options
result = minimizer.minimize(method='leastsq', xtol=1e-8, ftol=1e-8)

# Access detailed results
print(f"Function evaluations: {result.nfev}")
print(f"Success: {result.success}")
print(f"Message: {result.message}")
```

### Global Optimization

```python
# Global optimizers such as differential_evolution need finite
# bounds on every varying parameter
params['a'].set(min=0, max=100)
params['b'].set(min=0, max=10)
params['c'].set(min=-10, max=10)

# Global optimization for difficult problems
result_global = minimize(objective, params, method='differential_evolution',
                         args=(x, data), seed=123, maxiter=1000)

# Basin-hopping for problems with multiple local minima
result_basin = minimize(objective, params, method='basinhopping',
                        args=(x, data), niter=100, T=1.0)
```