
tessl/pypi-pymc3

Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Theano

Agent Success: 68% (agent success rate when using this tile)

Improvement: 0.94x (agent success rate when using this tile compared to baseline)

Baseline: 72% (agent success rate without this tile)


evals/scenario-6/task.md

Automatic MCMC Coin Bias

Implement a small Bayesian estimator for a single coin's bias that relies on the dependency's automatic MCMC step selection. The goal is to assess how efficiently the solver can build a simple model, let the package choose appropriate samplers, and return usable posterior summaries.

Capabilities

Posterior sampling with automatic step selection

  • Builds a coin-flip model with a Beta(2,2) prior on the bias and a Bernoulli likelihood for batched head/tail counts, then runs MCMC using the dependency's automatic step selection (no manually chosen kernels) to produce at least `draws` posterior samples. @test
  • Using heads=[8, 9, 10], tails=[2, 1, 0], draws=600, tune=300, and random_seed=123, returns a posterior mean between 0.85 and 0.95, and an identical mean when rerun with the same seed. @test

Posterior summaries

  • Returns a result dictionary containing samples, mean, and hdi, where samples contains floating-point draws in [0,1], mean is the arithmetic mean of samples, and hdi is a tuple of lower/upper 0.94 highest density interval bounds with lower <= upper. @test
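
As a rough illustration, the checks implied by the capabilities above could be exercised against the infer_coin_bias API defined below roughly as follows; the assertions simply restate the @test expectations and are a sketch, not the evaluation harness itself.

import numpy as np

# Sketch of the capability checks; assumes the infer_coin_bias signature
# given in the API section below.
result = infer_coin_bias(heads=[8, 9, 10], tails=[2, 1, 0],
                         draws=600, tune=300, random_seed=123)

samples = np.asarray(result["samples"], dtype=float)
assert samples.ndim == 1 and samples.size >= 600        # at least `draws` posterior samples
assert np.all((samples >= 0.0) & (samples <= 1.0))      # floating-point draws in [0, 1]
assert np.isclose(result["mean"], samples.mean())       # mean is the arithmetic mean of samples
lower, upper = result["hdi"]
assert lower <= upper                                    # 94% HDI bounds are ordered
assert 0.85 <= result["mean"] <= 0.95                    # expected posterior mean range

repeat = infer_coin_bias(heads=[8, 9, 10], tails=[2, 1, 0],
                         draws=600, tune=300, random_seed=123)
assert repeat["mean"] == result["mean"]                  # identical mean for the same seed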

Implementation

@generates

API

from typing import List, Tuple, Dict, Optional
import numpy as np

def infer_coin_bias(heads: List[int], tails: List[int], draws: int = 1000, tune: int = 500, random_seed: Optional[int] = None) -> Dict[str, object]:
    """
    heads and tails contain counts of flips per batch; builds a Bernoulli model with Beta(2,2) prior.
    Runs automatic MCMC step selection to sample the bias parameter.
    Returns a dictionary with:
    - "samples": 1D numpy array of posterior draws for the bias
    - "mean": float posterior mean of the bias
    - "hdi": Tuple[float, float] containing the lower and upper bounds of the 94% HDI
    """

Dependencies { .dependencies }

pymc { .dependency }

Probabilistic modeling with automatic sampler assignment for MCMC.

numpy { .dependency }

Array operations for input validation and summaries.
