
tessl/maven-com-github-haifengl--smile-core

Statistical Machine Intelligence and Learning Engine providing comprehensive machine learning algorithms for classification, regression, clustering, and feature engineering in Java


Classification

Comprehensive supervised learning algorithms for predicting categorical outcomes. Smile Core provides traditional algorithms such as decision trees and support vector machines, ensemble methods such as random forests and gradient boosting, and neural network approaches.

Capabilities

Core Classification Interface

All classification algorithms implement the unified Classifier<T> interface, providing consistent prediction methods and optional online learning support.

/**
 * Base classification interface for all supervised learning algorithms
 * @param <T> the type of input objects
 */
interface Classifier<T> extends ToIntFunction<T>, ToDoubleFunction<T>, Serializable {
    /** Predict the class label for input */
    int predict(T x);
    
    /** Predict class label and return class probabilities */
    int predict(T x, double[] posteriori);
    
    /** Get number of classes */
    int numClasses();
    
    /** Get class labels array */
    int[] classes();
    
    /** Online learning update (if supported) */
    void update(T x, int y);
    
    /** Create ensemble of multiple classifiers */
    static <T> Classifier<T> ensemble(Classifier<T>... classifiers);
}
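
Because every algorithm implements this interface, prediction code can be written once against Classifier<T>. A minimal sketch (assuming class labels are encoded 0..numClasses-1):

import smile.classification.Classifier;

/** Report the prediction and winning posterior for any fitted classifier. */
static void report(Classifier<double[]> model, double[] sample) {
    double[] posteriori = new double[model.numClasses()];
    int label = model.predict(sample, posteriori);
    // With 0-based labels, the predicted label indexes its own posterior
    System.out.printf("predicted class %d with probability %.3f%n", label, posteriori[label]);
}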

Random Forest

Ensemble method combining multiple decision trees with bootstrap sampling and random feature selection.

/**
 * Random Forest classifier implementing ensemble of decision trees
 */
class RandomForest implements Classifier<Tuple>, DataFrameClassifier, TreeSHAP {
    /** Train random forest with formula on DataFrame */
    public static RandomForest fit(Formula formula, DataFrame data);
    
    /** Train with custom parameters */
    public static RandomForest fit(Formula formula, DataFrame data, Properties params);
    
    /** Predict class label for Tuple */
    public int predict(Tuple x);
    
    /** Predict with class probabilities for Tuple */
    public int predict(Tuple x, double[] posteriori);
    
    /** Calculate SHAP values for feature importance */
    public double[] shap(Tuple x);
    
    /** Get out-of-bag error estimate */
    public double error();
    
    /** Get feature importance scores */
    public double[] importance();
    
    /** Trim forest to specified number of trees */
    public RandomForest trim(int ntrees);
    
    /** Merge with another random forest */
    public RandomForest merge(RandomForest other);
    
    /** Prune forest using test data */
    public RandomForest prune(DataFrame test);
}

Usage Example:

import smile.classification.RandomForest;
import smile.data.DataFrame;
import smile.data.formula.Formula;

// Train on DataFrame
Formula formula = Formula.lhs("species");
RandomForest model = RandomForest.fit(formula, irisData);

// Predict new samples (using Tuple from DataFrame)
int prediction = model.predict(newTuple);

// Get prediction probabilities
double[] probabilities = new double[3];
int predicted = model.predict(newTuple, probabilities);

// Get SHAP values for feature importance
double[] shapValues = model.shap(newTuple);

Decision Tree

Single decision tree classifier using the CART (Classification and Regression Trees) algorithm.

/**
 * Decision tree classifier using CART algorithm
 */
class DecisionTree implements Classifier<double[]>, DataFrameClassifier {
    /** Train decision tree with default parameters */
    public static DecisionTree fit(double[][] x, int[] y);
    
    /** Train decision tree with formula on DataFrame */
    public static DecisionTree fit(Formula formula, DataFrame data);
    
    /** Train with custom split rule and parameters */
    public static DecisionTree fit(double[][] x, int[] y, SplitRule rule, int maxDepth, int maxNodes, int nodeSize);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Get tree structure as string */
    public String toString();
    
    /** Get feature importance scores */
    public double[] importance();
}
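
Usage Example (a minimal sketch of the array-based overload above; x and y are an assumed feature matrix and label vector):

import smile.classification.DecisionTree;

// Train on raw arrays with default parameters
DecisionTree tree = DecisionTree.fit(x, y);

int label = tree.predict(x[0]);           // predict a single sample
double[] importance = tree.importance();  // one score per feature
System.out.println(tree);                 // printable tree structure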

Support Vector Machine

Support Vector Machine classifier with various kernel functions and multi-class support.

/**
 * Support Vector Machine classifier
 */
class SVM implements Classifier<double[]> {
    /** Train linear SVM */
    public static SVM fit(double[][] x, int[] y);
    
    /** Train SVM with RBF kernel */
    public static SVM fit(double[][] x, int[] y, double gamma);
    
    /** Train SVM with custom kernel and parameters */
    public static SVM fit(double[][] x, int[] y, Kernel kernel, double C, double tol);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Get support vectors */
    public SupportVector[] supportVectors();
    
    /** Get number of support vectors */
    public int numSupportVectors();
}
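
Usage Example (a sketch of the overloads above; x and y are assumed, and binary labels are typically expected):

import smile.classification.SVM;

// Linear SVM with default parameters
SVM linear = SVM.fit(x, y);

// RBF kernel with gamma = 0.5 (illustrative value)
SVM rbf = SVM.fit(x, y, 0.5);

int label = rbf.predict(x[0]);
System.out.println("support vectors: " + rbf.numSupportVectors());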

Logistic Regression

Linear classifier using logistic regression with L1/L2 regularization options.

/**
 * Logistic regression classifier
 */
class LogisticRegression implements Classifier<double[]> {
    /** Train logistic regression */
    public static LogisticRegression fit(double[][] x, int[] y);
    
    /** Train with regularization parameters */
    public static LogisticRegression fit(double[][] x, int[] y, double lambda, double tolerance, int maxIter);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Predict with class probabilities */
    public int predict(double[] x, double[] posteriori);
    
    /** Get model coefficients */
    public double[] coefficients();
    
    /** Get intercept term */
    public double intercept();
}
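
Usage Example (a sketch of the regularized overload above; the lambda, tolerance, and iteration cap are illustrative values, and x and y are assumed):

import smile.classification.LogisticRegression;

// Regularized fit: lambda = 0.1, tolerance = 1e-5, up to 500 iterations
LogisticRegression lr = LogisticRegression.fit(x, y, 0.1, 1e-5, 500);

double[] posteriori = new double[lr.numClasses()];
int label = lr.predict(x[0], posteriori);

double[] beta = lr.coefficients();  // one weight per feature
double bias = lr.intercept();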

Naive Bayes Classifiers

Family of probabilistic classifiers based on Bayes' theorem with strong feature-independence assumptions.

/**
 * Gaussian Naive Bayes classifier for continuous features
 */
class NaiveBayes implements Classifier<double[]> {
    /** Train Gaussian Naive Bayes */
    public static NaiveBayes fit(double[][] x, int[] y);
    
    /** Train with a specified distribution model, class count, and smoothing parameter */
    public static NaiveBayes fit(double[][] x, int[] y, Model model, int numClasses, double sigma);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Online learning update */
    public void update(double[] x, int y);
}

/**
 * Discrete Naive Bayes for categorical features
 */
class DiscreteNaiveBayes implements Classifier<int[]> {
    enum Model { BERNOULLI, MULTINOMIAL, CNB, WCNB, TWCNB }
    
    /** Train discrete Naive Bayes */
    public static DiscreteNaiveBayes fit(int[][] x, int[] y, Model model);
    
    /** Predict class label */
    public int predict(int[] x);
    
    /** Online learning update */
    public void update(int[] x, int y);
}
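
Usage Example (a sketch of the discrete variant for text-style count features; docs, labels, newDoc, and newLabel are assumed):

import smile.classification.DiscreteNaiveBayes;

// Multinomial model over term-count vectors
DiscreteNaiveBayes nb = DiscreteNaiveBayes.fit(docs, labels, DiscreteNaiveBayes.Model.MULTINOMIAL);

int label = nb.predict(docs[0]);

// Online learning: fold in a newly labeled document without refitting
nb.update(newDoc, newLabel);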

Neural Networks

Multi-layer perceptron classifier with configurable architecture and training options.

/**
 * Multi-Layer Perceptron classifier
 */
class MLP implements Classifier<double[]> {
    /** Train MLP with default architecture */
    public static MLP fit(double[][] x, int[] y);
    
    /** Train MLP with custom architecture */
    public static MLP fit(double[][] x, int[] y, int[] hiddenLayers, ActivationFunction activation);
    
    /** Train with full configuration */
    public static MLP fit(double[][] x, int[] y, Properties params);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Online learning update */
    public void update(double[] x, int y);
    
    /** Get network weights */
    public double[][] getWeights(int layer);
}
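
Usage Example (a sketch of the Properties-based overload above; the property key names and value formats are assumptions, not confirmed keys, and x and y are assumed):

import java.util.Properties;
import smile.classification.MLP;

Properties params = new Properties();
params.setProperty("smile.mlp.layers", "ReLU(50)|ReLU(20)");  // assumed key and format
params.setProperty("smile.mlp.epochs", "100");                // assumed key
MLP net = MLP.fit(x, y, params);

int label = net.predict(x[0]);
net.update(x[1], y[1]);  // online refinement with one more labeled sample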

Ensemble Methods

Advanced ensemble techniques for improved prediction accuracy and robustness.

/**
 * Adaptive Boosting (AdaBoost) classifier
 */
class AdaBoost implements Classifier<double[]> {
    /** Train AdaBoost with decision stumps */
    public static AdaBoost fit(double[][] x, int[] y);
    
    /** Train with custom weak learner and iterations */
    public static AdaBoost fit(double[][] x, int[] y, int numTrees, int maxDepth);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Get weak learner weights */
    public double[] importance();
}

/**
 * Gradient Tree Boosting classifier
 */
class GradientTreeBoost implements Classifier<double[]>, DataFrameClassifier {
    /** Train gradient boosting */
    public static GradientTreeBoost fit(double[][] x, int[] y);
    
    /** Train with formula on DataFrame */
    public static GradientTreeBoost fit(Formula formula, DataFrame data);
    
    /** Train with custom parameters */
    public static GradientTreeBoost fit(double[][] x, int[] y, int numTrees, int maxDepth, double shrinkage);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Get feature importance */
    public double[] importance();
}
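
Usage Example (a sketch of the custom-parameter overloads above; the numbers are illustrative, not tuned defaults, and x and y are assumed):

import smile.classification.AdaBoost;
import smile.classification.GradientTreeBoost;

// AdaBoost: 200 trees of depth 3
AdaBoost ada = AdaBoost.fit(x, y, 200, 3);

// Gradient boosting: 300 trees of depth 5, shrinkage 0.05
GradientTreeBoost gbt = GradientTreeBoost.fit(x, y, 300, 5, 0.05);

int label = gbt.predict(x[0]);
double[] importance = gbt.importance();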

K-Nearest Neighbors

Instance-based classifier that predicts by majority vote among the k nearest training samples.

/**
 * K-Nearest Neighbors classifier
 */
class KNN implements Classifier<double[]> {
    /** Train KNN classifier */
    public static KNN fit(double[][] x, int[] y, int k);
    
    /** Train with custom distance metric */
    public static KNN fit(double[][] x, int[] y, int k, Distance<double[]> distance);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Predict class label and return class probabilities */
    public int predict(double[] x, double[] posteriori);
    
    /** Get k parameter */
    public int k();
}
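
Usage Example (a sketch of the distance-metric overload above; EuclideanDistance is assumed to come from smile.math.distance, and x and y are assumed):

import smile.classification.KNN;
import smile.math.distance.EuclideanDistance;

// 5-nearest-neighbor classifier with an explicit metric
KNN model = KNN.fit(x, y, 5, new EuclideanDistance());

int label = model.predict(x[0]);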

Multi-class Strategies

Techniques for extending binary classifiers to multi-class problems.

/**
 * One-versus-one multi-class strategy
 */
class OneVersusOne implements Classifier<double[]> {
    /** Train OvO with binary classifier trainer */
    public static OneVersusOne fit(Classifier.Trainer<double[], ?> trainer, double[][] x, int[] y);
    
    /** Predict class label */
    public int predict(double[] x);
}

/**
 * One-versus-rest multi-class strategy
 */
class OneVersusRest implements Classifier<double[]> {
    /** Train OvR with binary classifier trainer */
    public static OneVersusRest fit(Classifier.Trainer<double[], ?> trainer, double[][] x, int[] y);
    
    /** Predict class label */
    public int predict(double[] x);
}
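
Usage Example (a sketch assuming Classifier.Trainer is a functional interface that can be supplied as a lambda over a binary fit method; x and y are assumed):

import smile.classification.LogisticRegression;
import smile.classification.OneVersusRest;

// Wrap a binary learner for a k-class problem
OneVersusRest ovr = OneVersusRest.fit(
        (xi, yi) -> LogisticRegression.fit(xi, yi), x, y);

int label = ovr.predict(x[0]);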

Linear Discriminant Analysis

Linear and quadratic discriminant analysis for Gaussian-distributed classes.

/**
 * Linear Discriminant Analysis
 */
class LDA implements Classifier<double[]> {
    /** Train LDA classifier */
    public static LDA fit(double[][] x, int[] y);
    
    /** Train with prior probabilities */
    public static LDA fit(double[][] x, int[] y, double[] priori);
    
    /** Predict class label */
    public int predict(double[] x);
    
    /** Get discriminant projection */
    public double[] project(double[] x);
}

/**
 * Quadratic Discriminant Analysis
 */
class QDA implements Classifier<double[]> {
    /** Train QDA classifier */
    public static QDA fit(double[][] x, int[] y);
    
    /** Train with prior probabilities */
    public static QDA fit(double[][] x, int[] y, double[] priori);
    
    /** Predict class label */
    public int predict(double[] x);
}

/**
 * Regularized Discriminant Analysis
 */
class RDA implements Classifier<double[]> {
    /** Train RDA with regularization parameters */
    public static RDA fit(double[][] x, int[] y, double alpha, double gamma);
    
    /** Predict class label */
    public int predict(double[] x);
}
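
Usage Example (a sketch of the overloads above; the RDA regularization values are illustrative, and x and y are assumed):

import smile.classification.LDA;
import smile.classification.QDA;
import smile.classification.RDA;

LDA lda = LDA.fit(x, y);
double[] projected = lda.project(x[0]);  // coordinates in discriminant space

QDA qda = QDA.fit(x, y);                 // per-class covariance
RDA rda = RDA.fit(x, y, 0.5, 0.5);       // interpolates between LDA and QDA

int label = rda.predict(x[0]);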

Utility Classes

Helper classes for classification tasks and probability calibration.

/**
 * Class label encoding utilities
 */
class ClassLabels {
    /** Fit encoder from class labels */
    public static ClassLabels fit(int[] y);
    
    /** Encoded class labels */
    public final int[] classes;
    
    /** Number of classes */
    public final int numClasses;
}

/**
 * Platt scaling for probability calibration
 */
class PlattScaling {
    /** Fit Platt scaling from classifier outputs */
    public static PlattScaling fit(double[] scores, int[] y);
    
    /** Apply calibration to classifier output */
    public double calibrate(double score);
}

/**
 * Isotonic regression scaling for probability calibration
 */
class IsotonicRegressionScaling {
    /** Fit isotonic regression scaling */
    public static IsotonicRegressionScaling fit(double[] scores, int[] y);
    
    /** Apply calibration to classifier output */
    public double calibrate(double score);
}
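
Usage Example (a sketch of score calibration; scores, y, and rawScore are assumed to be decision values and binary labels from a held-out set):

import smile.classification.PlattScaling;

// Fit a sigmoid mapping from raw decision scores to probabilities
PlattScaling platt = PlattScaling.fit(scores, y);

double p = platt.calibrate(rawScore);  // calibrated probability of the positive class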

Training Patterns

All classifiers follow consistent training patterns:

Array-based training:

Classifier model = Algorithm.fit(double[][] x, int[] y);
Classifier model = Algorithm.fit(double[][] x, int[] y, Properties params);

DataFrame-based training:

Classifier model = Algorithm.fit(Formula formula, DataFrame data);

Prediction patterns:

int prediction = model.predict(double[] x);
int prediction = model.predict(double[] x, double[] posteriori);

Common Parameters

Most classification algorithms support these common configuration options, typically passed through a Properties object as sketched after the list:

  • maxDepth: Maximum tree depth (tree-based algorithms)
  • numTrees: Number of trees in ensemble
  • nodeSize: Minimum samples per leaf node
  • subsample: Fraction of samples for bootstrap
  • mtry: Number of features to consider at each split
  • splitRule: Splitting criterion (GINI, ENTROPY, CLASSIFICATION_ERROR)
  • seed: Random seed for reproducibility
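
A minimal sketch of passing these options through Properties, reusing the Random Forest example above (the formula and DataFrame are assumed, and the key names follow a smile.random.forest.* convention that is an assumption here):

import java.util.Properties;
import smile.classification.RandomForest;
import smile.data.DataFrame;
import smile.data.formula.Formula;

Properties params = new Properties();
params.setProperty("smile.random.forest.trees", "500");     // numTrees
params.setProperty("smile.random.forest.max.depth", "20");  // maxDepth
params.setProperty("smile.random.forest.node.size", "5");   // nodeSize
RandomForest model = RandomForest.fit(Formula.lhs("species"), data, params);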

Install with Tessl CLI

npx tessl i tessl/maven-com-github-haifengl--smile-core
