# MCMC Step Methods

PyMC3 provides a comprehensive suite of MCMC step methods for sampling from posterior distributions. Step methods define how the sampler moves through parameter space, with automatic step method assignment based on variable types and model structure.

## Capabilities

### Hamiltonian Monte Carlo Methods

Gradient-based samplers that use the geometry of the log-posterior to explore high-dimensional parameter spaces efficiently.

```python { .api }
class NUTS:
    """
    No-U-Turn Sampler - adaptive Hamiltonian Monte Carlo variant.

    Parameters:
    - vars: list, variables to sample (all continuous vars if None)
    - target_accept: float, target acceptance probability (default 0.8)
    - max_treedepth: int, maximum tree depth (default 10)
    - step_scale: float, initial step size scaling
    - is_cov: bool, treat step_scale as covariance matrix
    - model: Model, model to sample from
    - **kwargs: additional sampler arguments
    """

class HamiltonianMC:
    """
    Hamiltonian Monte Carlo sampler with fixed step size and path length.

    Parameters:
    - vars: list, variables to sample (all continuous vars if None)
    - path_length: float, length of Hamiltonian trajectory
    - step_rand: function, step size randomization function
    - step_scale: float, step size scaling
    - is_cov: bool, treat step_scale as covariance matrix
    - model: Model, model to sample from
    - **kwargs: additional sampler arguments
    """
```

### Metropolis Methods

Random-walk Metropolis samplers with various proposal distributions for different variable types.

```python { .api }
class Metropolis:
    """
    General Metropolis-Hastings sampler with configurable proposals.

    Parameters:
    - vars: list, variables to sample (all vars if None)
    - S: array or matrix, proposal covariance or scaling
    - proposal_dist: function, proposal distribution
    - scaling: float, proposal scaling factor
    - tune: bool, automatically tune proposal during sampling
    - tune_interval: int, tuning interval in samples
    - model: Model, model to sample from
    """

class BinaryMetropolis:
    """
    Metropolis sampler for binary variables using bit flipping.

    Parameters:
    - vars: list, binary variables to sample
    - scaling: float, probability of proposing a flip
    - tune: bool, automatically tune scaling
    - model: Model, model to sample from
    """

class BinaryGibbsMetropolis:
    """
    Gibbs sampler for binary variables using conditional distributions.

    Parameters:
    - vars: list, binary variables to sample
    - order: str, variable update order ('random' or 'fixed')
    - model: Model, model to sample from
    """

class CategoricalGibbsMetropolis:
    """
    Gibbs sampler for categorical variables.

    Parameters:
    - vars: list, categorical variables to sample
    - model: Model, model to sample from
    """

class DEMetropolis:
    """
    Differential Evolution Metropolis for efficient parallel sampling.

    Parameters:
    - vars: list, variables to sample
    - lamb: float, differential evolution parameter
    - tune: str, tuning method ('lambda' or 'scaling')
    - tune_interval: int, tuning interval
    - model: Model, model to sample from
    """

class DEMetropolisZ:
    """
    DE-MCMC-Z sampler using past chain history for proposals.

    Parameters:
    - vars: list, variables to sample
    - lamb: float, differential evolution parameter
    - tune: str, tuning method
    - tune_interval: int, tuning interval
    - model: Model, model to sample from
    """
```

### Proposal Distributions

Configurable proposal distributions for Metropolis samplers.

```python { .api }
class NormalProposal:
    """Normal proposal distribution for continuous variables."""

class CauchyProposal:
    """Cauchy proposal distribution with heavy tails."""

class LaplaceProposal:
    """Laplace proposal distribution."""

class PoissonProposal:
    """Poisson proposal distribution for count data."""

class UniformProposal:
    """Uniform proposal distribution."""

class MultivariateNormalProposal:
    """Multivariate normal proposal for correlated variables."""
```

### Specialized Samplers

Advanced samplers for specific model types and sampling scenarios.

```python { .api }
class Slice:
    """
    Slice sampler for univariate continuous variables.

    Parameters:
    - vars: list, variables to sample
    - w: float or array, initial bracket width
    - tune: bool, automatically tune bracket width
    - model: Model, model to sample from
    """

class EllipticalSlice:
    """
    Elliptical slice sampler for variables with Gaussian priors.

    Parameters:
    - vars: list, variables with Gaussian priors
    - prior_cov: array, prior covariance matrix
    - model: Model, model to sample from
    """

class ElemwiseCategorical:
    """
    Element-wise Gibbs sampler for categorical variables.

    Parameters:
    - vars: list, categorical variables to sample
    - values: list, possible values for each variable
    - model: Model, model to sample from
    """

class PGBART:
    """
    Particle Gibbs sampler for Bayesian Additive Regression Trees.

    Parameters:
    - vars: list, BART variables to sample
    - num_particles: int, number of particles
    - batch: bool, use batch updates
    - model: Model, model to sample from
    """

class CompoundStep:
    """
    Compound step method that combines multiple step methods.

    Parameters:
    - methods: list, step methods to combine
    - model: Model, model to sample from
    """
```

### Multi-Level Samplers

Specialized samplers for hierarchical and multi-level models.

```python { .api }
class MLDA:
    """
    Multi-Level Delayed Acceptance sampler for hierarchical models.

    Parameters:
    - coarse_models: list, coarse approximation models
    - base_sampler: step method for fine level
    - base_scaling: float, scaling for base sampler
    - model: Model, fine-level model to sample from
    """

class MetropolisMLDA:
    """MLDA with Metropolis base sampler."""

class DEMetropolisZMLDA:
    """MLDA with DE-MCMC-Z base sampler."""

class RecursiveDAProposal:
    """
    Recursive delayed acceptance proposal for MLDA.

    Parameters:
    - coarse_models: list, hierarchy of coarse models
    - base_proposal: proposal for finest level
    """
```

## Usage Examples

### Basic Step Method Assignment

```python
import numpy as np
import pymc3 as pm

data = np.random.randn(100)  # example observations

with pm.Model() as model:
    # Define model variables
    mu = pm.Normal('mu', 0, 1)
    sigma = pm.HalfNormal('sigma', 1)
    y = pm.Normal('y', mu, sigma, observed=data)

    # Automatic step method assignment
    trace = pm.sample(1000)  # NUTS for continuous vars

    # Manual step method specification
    step = pm.NUTS([mu, sigma])
    trace = pm.sample(1000, step=step)
```

### Multiple Step Methods

```python
with pm.Model() as model:
    # Continuous and discrete variables
    mu = pm.Normal('mu', 0, 1)
    p = pm.Beta('p', 1, 1)
    category = pm.Categorical('category', p=[0.3, 0.7])

    # Compound step method
    step1 = pm.NUTS([mu, p])  # HMC for continuous
    step2 = pm.CategoricalGibbsMetropolis([category])  # Gibbs for categorical
    step = pm.CompoundStep([step1, step2])

    trace = pm.sample(1000, step=step)
```

### Custom Proposal Tuning

```python
with pm.Model() as model:
    x = pm.Normal('x', 0, 1)

    # Metropolis with a custom proposal; proposal_dist takes the
    # proposal class, which Metropolis instantiates internally
    step = pm.Metropolis([x], proposal_dist=pm.NormalProposal,
                         scaling=0.5, tune=True)

    trace = pm.sample(1000, step=step, tune=500)
```