Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Theano
Agent Success: 68% (agent success rate when using this tile)
Improvement: 0.94x (agent success rate when using this tile compared to baseline)
Baseline: 72% (agent success rate without this tile)
Implement a small Bayesian estimator for a single coin's bias that relies on the dependency's automatic MCMC step selection. The goal is to show how efficiently the solver can build a simple model, let the package choose appropriate samplers, and return usable posterior summaries.
@generates: draws posterior samples.
@test: with heads=[8, 9, 10], tails=[2, 1, 0], draws=600, tune=300, and random_seed=123, returns a posterior mean between 0.85 and 0.95, and an identical mean when rerun with the same seed.
@test: returns samples, mean, and hdi, where samples contains floating-point draws in [0, 1], mean is the arithmetic mean of samples, and hdi is a tuple of lower/upper 0.94 highest-density-interval bounds with lower <= upper.
from typing import List, Tuple, Dict, Optional
import numpy as np
def infer_coin_bias(heads: List[int], tails: List[int], draws: int = 1000, tune: int = 500, random_seed: Optional[int] = None) -> Dict[str, object]:
"""
heads and tails contain counts of flips per batch; builds a Bernoulli model with Beta(2,2) prior.
Runs automatic MCMC step selection to sample the bias parameter.
Returns a dictionary with:
- "samples": 1D numpy array of posterior draws for the bias
- "mean": float posterior mean of the bias
- "hdi": Tuple[float, float] containing the lower and upper bounds of the 94% HDI
"""Probabilistic modeling with automatic sampler assignment for MCMC.
Array operations for input validation and summaries.
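Because a Beta prior is conjugate to Bernoulli counts, the posterior here has a closed form, which gives a way to sanity-check the MCMC-based solution without the sampler. The sketch below is a hypothetical reference helper (the name `infer_coin_bias_conjugate` and the shortest-window HDI routine are assumptions, not part of the task's required implementation); it draws from the exact Beta posterior with NumPy instead of relying on automatic MCMC step selection:

```python
import numpy as np

def infer_coin_bias_conjugate(heads, tails, draws=1000, random_seed=None):
    """Closed-form check: Beta(2, 2) prior + Bernoulli counts -> Beta posterior.

    Hypothetical helper; samples the exact posterior directly rather than
    running MCMC, so its summaries can validate the sampler-based version.
    """
    h, t = int(np.sum(heads)), int(np.sum(tails))
    rng = np.random.default_rng(random_seed)
    # Conjugacy: the posterior for the bias is Beta(2 + heads, 2 + tails).
    samples = rng.beta(2 + h, 2 + t, size=draws)
    mean = float(samples.mean())
    # 94% highest-density interval: shortest window covering 94% of sorted draws.
    s = np.sort(samples)
    n = max(int(np.floor(0.94 * len(s))), 1)
    widths = s[n:] - s[:-n]
    i = int(np.argmin(widths))
    hdi = (float(s[i]), float(s[i + n]))
    return {"samples": samples, "mean": mean, "hdi": hdi}
```

With heads=[8, 9, 10] and tails=[2, 1, 0] the posterior is Beta(29, 5), so the mean should land near 0.85; the fixed seed makes reruns deterministic, mirroring the reproducibility requirement in the test spec.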