tessl/pypi-pymc3

Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Theano

Agent Success: 68% — agent success rate when using this tile

Improvement: 0.94x — agent success rate improvement when using this tile compared to baseline

Baseline: 72% — agent success rate without this tile

evals/scenario-7/task.md

Variational Logistic Regression

Build a binary classifier using the package's variational inference suite with both diagonal and full-covariance approximations. Compare the approximations by their evidence lower bound (ELBO) progress and use the better one for predictions.

Capabilities

Runs dual approximations

  • Calling fit_variational_logistic_regression performs both mean-field-style and full-covariance-style variational optimizations for the same logistic model, each for vi_steps iterations, returning ELBO traces for both. @test

Deterministic with seed

  • Supplying random_seed produces identical ELBO traces and predictive probabilities across repeated calls on the same data. @test

Predictive separation

  • On a linearly separable training set with negative features labeled 0 and positive features labeled 1, predictions on X_test=[[-2.0], [2.0]] yield probabilities < 0.2 for the negative point and > 0.8 for the positive point when using the better-performing approximation (higher final ELBO). @test

Reports chosen approximation

  • Return payload includes the name of the approximation used for predictions ("mean_field" or "full_rank") and the corresponding ELBO history. @test
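The selection rule implied by the capabilities above (prefer the approximation with the higher final ELBO) can be sketched as a small helper. `choose_approximation` is a hypothetical name for illustration, not part of the spec:

```python
from typing import Literal


def choose_approximation(
    mean_field_elbo: list[float],
    full_rank_elbo: list[float],
) -> Literal["mean_field", "full_rank"]:
    """Pick the approximation whose final ELBO is higher (less negative).

    Ties fall back to the cheaper mean-field approximation.
    """
    if full_rank_elbo[-1] > mean_field_elbo[-1]:
        return "full_rank"
    return "mean_field"
```

Only the last element of each trace matters for selection; the full traces are still returned so the caller can inspect convergence.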

Implementation

@generates

API

```python
from typing import Literal, TypedDict
import numpy as np

class VariationalResult(TypedDict):
    approximation: Literal["mean_field", "full_rank"]
    mean_field_elbo: list[float]
    full_rank_elbo: list[float]
    test_probabilities: np.ndarray

def fit_variational_logistic_regression(
    X_train: np.ndarray,
    y_train: np.ndarray,
    X_test: np.ndarray,
    *,
    vi_steps: int = 1000,
    draws: int = 1000,
    random_seed: int | None = None,
) -> VariationalResult:
    """
    Build a one-dimensional logistic regression model with an intercept and slope,
    run both diagonal and dense-covariance variational fits,
    pick the approximation with the higher final ELBO,
    draw posterior predictive probabilities for X_test,
    and return the traces and chosen approximation label.
    """
```

Dependencies { .dependencies }

pymc { .dependency }

Provides probabilistic modeling and variational inference.
