
tessl/pypi-pymc3

Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Theano



HSGP 1D Regression

A module for building and using a Hilbert space approximate Gaussian process for scalar observations over a single continuous input dimension.

Capabilities

Model assembly

  • Given 1D inputs [0, 1, 2, 3] and matching targets [1.2, 1.4, 1.1, 0.9] plus basis_functions=25, lengthscale prior mean 1.0, variance prior 0.5, and noise prior 0.2, returns a probabilistic model whose latent function is represented with exactly 25 basis functions and whose coordinate metadata spans the input range expanded by the default grid scale. @test

Posterior sampling

  • Running inference for 50 warmup and 50 posterior draws on the returned model produces an inference object containing posterior draws for the latent function values at the training inputs and for the observation noise scale; the noise-scale draws must be strictly positive. @test

Posterior predictive

  • Given the inference object and new inputs [-0.5, 0.5, 1.5], generates predictive means and 94% central intervals matching the order of the new inputs and shaped (3,) for means and (3, 2) for intervals. @test

Prior predictive

  • Drawing 20 prior predictive samples for inputs [0.0, 0.25, 0.5] yields an array shaped (20, 3) with finite values. @test
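The capabilities above rest on the Hilbert space approximation: on a symmetric interval [-L, L], the latent GP is expanded in the Laplacian eigenfunctions phi_j(x) = sqrt(1/L) sin(pi j (x + L) / (2L)), weighted by the square root of the kernel's spectral density at the matching eigenfrequencies. A minimal NumPy sketch of the prior predictive capability follows; it is not the generated implementation, and the squared-exponential spectral density, the centering logic, and the floor on L are assumptions for illustration only.

```python
import numpy as np

def hsgp_basis(x, m, L):
    """Laplacian eigenfunctions on [-L, L] evaluated at x; shape (len(x), m)."""
    j = np.arange(1, m + 1)
    return np.sqrt(1.0 / L) * np.sin(np.pi * j * (x[:, None] + L) / (2 * L))

def rbf_spectral_density(omega, lengthscale, variance):
    """Spectral density of the 1D squared-exponential kernel at frequencies omega."""
    return variance * np.sqrt(2 * np.pi) * lengthscale * np.exp(-0.5 * (lengthscale * omega) ** 2)

def prior_predictive_sketch(x, m=25, lengthscale=1.0, variance=0.5, noise=0.2,
                            grid_scale=1.2, size=20, seed=0):
    """Draw `size` prior predictive samples at inputs x; returns shape (size, len(x))."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    # Expand the input range by grid_scale so the zero boundary conditions
    # of the basis do not distort the function over the data.
    center = 0.5 * (x.max() + x.min())
    half_range = 0.5 * (x.max() - x.min())
    L = grid_scale * max(half_range, 1.0)          # floor is an assumption
    phi = hsgp_basis(x - center, m, L)             # (n, m) basis matrix
    omega = np.pi * np.arange(1, m + 1) / (2 * L)  # eigenfrequencies
    sqrt_S = np.sqrt(rbf_spectral_density(omega, lengthscale, variance))
    beta = rng.standard_normal((size, m))          # standard-normal basis weights
    f = (beta * sqrt_S) @ phi.T                    # (size, n) latent draws
    return f + noise * rng.standard_normal(f.shape)  # add observation noise

draws = prior_predictive_sketch([0.0, 0.25, 0.5], m=25, size=20)
print(draws.shape)  # (20, 3)
```

With 25 basis functions the truncation error is negligible at these lengthscales; the draws are finite and shaped (size, n_points), matching the capability above.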

Implementation

@generates

API

from typing import Iterable, Sequence, Tuple, Dict, Any
import numpy as np

def build_hsgp_model(
    x: Sequence[float],
    y: Sequence[float],
    *,
    basis_functions: int,
    lengthscale_prior: float,
    variance_prior: float,
    noise_prior: float,
    grid_scale: float = 1.2,
) -> Tuple[Any, Dict[str, Any]]:
    """
    Builds and returns a probabilistic model and its coordinate metadata for 1D regression
    using a stationary kernel approximated with a finite Hilbert basis.
    Inputs x and y must have the same length; all priors are positive scales; basis_functions must be > 0.
    """

def run_inference(model: Any, *, draws: int = 500, tune: int = 500, random_seed: int | None = None) -> Any:
    """
    Runs gradient-based MCMC for the provided model and returns an inference data object
    containing posterior draws for the latent function and observation noise scale.
    """

def posterior_predictive(
    model: Any,
    idata: Any,
    new_x: Iterable[float],
    *,
    hdi_prob: float = 0.94,
) -> Dict[str, np.ndarray]:
    """
    Uses posterior draws to compute predictive means and highest-density intervals at new_x.
    Returns a dict with keys 'mean' (shape (n_new,)) and 'hdi' (shape (n_new, 2)).
    """

def prior_predictive(model: Any, x_grid: Iterable[float], *, size: int = 100) -> np.ndarray:
    """
    Draws prior predictive samples at x_grid using the same truncated basis; returns array shaped (size, n_points).
    """

Dependencies { .dependencies }

pymc { .dependency }

Provides probabilistic programming primitives, Gaussian process components, and inference utilities.
