tessl/pypi-pymc3

Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Theano


docs/step-methods.md

MCMC Step Methods

PyMC3 provides a comprehensive suite of MCMC step methods for sampling from posterior distributions. Step methods define how the sampler moves through parameter space, with automatic step method assignment based on variable types and model structure.

Capabilities

Hamiltonian Monte Carlo Methods

Gradient-based samplers that use the gradient of the log posterior to explore high-dimensional parameter spaces efficiently.

class NUTS:
    """
    No-U-Turn Sampler - adaptive Hamiltonian Monte Carlo variant.
    
    Parameters:
    - vars: list, variables to sample (all continuous vars if None)
    - target_accept: float, target acceptance probability (default 0.8)
    - max_treedepth: int, maximum tree depth (default 10)
    - step_scale: float, initial step size scaling
    - is_cov: bool, treat the provided scaling matrix as a covariance (mass) matrix rather than a precision matrix
    - model: Model, model to sample from
    - **kwargs: additional sampler arguments
    """

class HamiltonianMC:
    """
    Hamiltonian Monte Carlo sampler with fixed step size and path length.
    
    Parameters:
    - vars: list, variables to sample (all continuous vars if None)
    - path_length: float, length of Hamiltonian trajectory
    - step_rand: function, step size randomization function
    - step_scale: float, step size scaling
    - is_cov: bool, treat the provided scaling matrix as a covariance (mass) matrix rather than a precision matrix
    - model: Model, model to sample from
    - **kwargs: additional sampler arguments
    """

Metropolis Methods

Random-walk Metropolis samplers with various proposal distributions for different variable types.

class Metropolis:
    """
    General Metropolis-Hastings sampler with configurable proposals.
    
    Parameters:
    - vars: list, variables to sample (all vars if None)
    - S: array or matrix, proposal covariance or scaling
    - proposal_dist: function, proposal distribution
    - scaling: float, proposal scaling factor
    - tune: bool, automatically tune proposal during sampling
    - tune_interval: int, tuning interval in samples
    - model: Model, model to sample from
    """

class BinaryMetropolis:
    """
    Metropolis sampler for binary variables using bit flipping.
    
    Parameters:
    - vars: list, binary variables to sample
    - scaling: float, probability of proposing a flip
    - tune: bool, automatically tune scaling
    - model: Model, model to sample from
    """

class BinaryGibbsMetropolis:
    """
    Gibbs sampler for binary variables using conditional distributions.
    
    Parameters:
    - vars: list, binary variables to sample
    - order: str, variable update order ('random' or 'fixed')
    - model: Model, model to sample from
    """

class CategoricalGibbsMetropolis:
    """
    Gibbs sampler for categorical variables.
    
    Parameters:
    - vars: list, categorical variables to sample
    - model: Model, model to sample from
    """

class DEMetropolis:
    """
    Differential Evolution Metropolis for efficient parallel sampling.
    
    Parameters:
    - vars: list, variables to sample
    - lamb: float, differential evolution parameter
    - tune: str, tuning method ('lambda' or 'scaling')
    - tune_interval: int, tuning interval
    - model: Model, model to sample from
    """

class DEMetropolisZ:
    """
    DE-MCMC-Z sampler using past chain history for proposals.
    
    Parameters:
    - vars: list, variables to sample
    - lamb: float, differential evolution parameter  
    - tune: str, tuning method
    - tune_interval: int, tuning interval
    - model: Model, model to sample from
    """

Proposal Distributions

Configurable proposal distributions for Metropolis samplers.

class NormalProposal:
    """Normal proposal distribution for continuous variables."""

class CauchyProposal:
    """Cauchy proposal distribution with heavy tails."""

class LaplaceProposal:
    """Laplace proposal distribution."""

class PoissonProposal:
    """Poisson proposal distribution for count data."""

class UniformProposal:
    """Uniform proposal distribution."""

class MultivariateNormalProposal:
    """Multivariate normal proposal for correlated variables."""

Specialized Samplers

Advanced samplers for specific model types and sampling scenarios.

class Slice:
    """
    Slice sampler for univariate continuous variables.
    
    Parameters:
    - vars: list, variables to sample
    - w: float or array, initial bracket width
    - tune: bool, automatically tune bracket width
    - model: Model, model to sample from
    """

class EllipticalSlice:
    """
    Elliptical slice sampler for variables with Gaussian priors.
    
    Parameters:
    - vars: list, variables with Gaussian priors
    - prior_cov: array, prior covariance matrix
    - model: Model, model to sample from
    """

class ElemwiseCategorical:
    """
    Element-wise Gibbs sampler for categorical variables.
    
    Parameters:
    - vars: list, categorical variables to sample
    - values: list, possible values for each variable
    - model: Model, model to sample from
    """

class PGBART:
    """
    Particle Gibbs sampler for Bayesian Additive Regression Trees.
    
    Parameters:
    - vars: list, BART variables to sample
    - num_particles: int, number of particles
    - batch: bool, use batch updates
    - model: Model, model to sample from
    """

class CompoundStep:
    """
    Compound step method that combines multiple step methods.
    
    Parameters:
    - methods: list, step methods to combine
    - model: Model, model to sample from
    """

Multi-Level Samplers

Specialized samplers for hierarchical and multi-level models.

class MLDA:
    """
    Multi-Level Delayed Acceptance sampler for hierarchical models.
    
    Parameters:
    - coarse_models: list, coarse approximation models
    - base_sampler: str, step method used on the coarsest level ('DEMetropolisZ' or 'Metropolis')
    - base_scaling: float, scaling for the base sampler
    - model: Model, fine-level model to sample from
    """

class MetropolisMLDA:
    """MLDA with Metropolis base sampler."""

class DEMetropolisZMLDA:
    """MLDA with DE-MCMC-Z base sampler."""

class RecursiveDAProposal:
    """
    Recursive delayed acceptance proposal for MLDA.
    
    Parameters:
    - coarse_models: list, hierarchy of coarse models
    - base_proposal: proposal for finest level
    """

Usage Examples

Basic Step Method Assignment

import numpy as np
import pymc3 as pm

data = np.random.randn(100)  # synthetic observations for illustration

with pm.Model() as model:
    # Define model variables
    mu = pm.Normal('mu', 0, 1)
    sigma = pm.HalfNormal('sigma', 1)
    y = pm.Normal('y', mu, sigma, observed=data)
    
    # Automatic step method assignment
    trace = pm.sample(1000)  # NUTS for continuous vars
    
    # Manual step method specification
    step = pm.NUTS([mu, sigma])
    trace = pm.sample(1000, step=step)

Multiple Step Methods

with pm.Model() as model:
    # Continuous and discrete variables
    mu = pm.Normal('mu', 0, 1)
    p = pm.Beta('p', 1, 1)
    category = pm.Categorical('category', [0.3, 0.7])
    
    # Compound step method
    step1 = pm.NUTS([mu, p])  # NUTS for continuous variables
    step2 = pm.CategoricalGibbsMetropolis([category])  # Gibbs for categorical
    step = pm.CompoundStep([step1, step2])
    
    trace = pm.sample(1000, step=step)

Custom Proposal Tuning

with pm.Model() as model:
    x = pm.Normal('x', 0, 1)
    
    # Metropolis with a custom proposal: pass the proposal class;
    # Metropolis instantiates it internally, scaled by `scaling`
    step = pm.Metropolis([x], proposal_dist=pm.NormalProposal,
                         scaling=0.5, tune=True)
    
    trace = pm.sample(1000, step=step, tune=500)

Install with Tessl CLI

npx tessl i tessl/pypi-pymc3
