
# Classification Algorithms

Orange3 provides a comprehensive collection of supervised learning algorithms for categorical prediction tasks. All classifiers follow a consistent pattern: create a learner, then call it with training data to produce a trained model.

## Capabilities

### Decision Trees

Tree-based classification algorithms that create interpretable decision rules.

```python { .api }
class TreeLearner:
    """
    Decision tree classifier.

    Args:
        criterion: Split criterion ('gini', 'entropy')
        max_depth: Maximum tree depth
        min_samples_split: Minimum samples required to split
        min_samples_leaf: Minimum samples in leaf nodes
    """
    def __init__(self, criterion=None, max_depth=None,
                 min_samples_split=2, min_samples_leaf=1): ...

    def __call__(self, data):
        """Train and return classification model."""

class SimpleTreeLearner:
    """Fast, simplified decision tree implementation."""
    def __init__(self, max_depth=None, min_samples_split=5): ...

    def __call__(self, data):
        """Train and return tree model."""
```

### Logistic Regression

Linear models for classification with probabilistic outputs.

```python { .api }
class LogisticRegressionLearner:
    """
    Logistic regression classifier.

    Args:
        penalty: Regularization type ('l1', 'l2', 'elasticnet')
        C: Inverse regularization strength
        solver: Optimization algorithm
    """
    def __init__(self, penalty='l2', C=1.0, solver='lbfgs'): ...

    def __call__(self, data):
        """Train and return logistic regression model."""

class SoftmaxRegressionLearner:
    """Multi-class logistic regression."""
    def __init__(self, penalty='l2', C=1.0): ...

    def __call__(self, data):
        """Train and return softmax regression model."""
```

### Support Vector Machines

SVM-based classification with various kernel options.

```python { .api }
class SVMLearner:
    """
    Support Vector Machine classifier.

    Args:
        kernel: Kernel type ('linear', 'poly', 'rbf', 'sigmoid')
        C: Regularization parameter
        gamma: Kernel coefficient
        degree: Polynomial kernel degree
    """
    def __init__(self, kernel='rbf', C=1.0, gamma='scale', degree=3): ...

    def __call__(self, data):
        """Train and return SVM model."""

class LinearSVMLearner:
    """Linear Support Vector Machine."""
    def __init__(self, C=1.0, dual=True): ...

    def __call__(self, data):
        """Train and return linear SVM model."""

class NuSVMLearner:
    """Nu-Support Vector Machine."""
    def __init__(self, nu=0.5, kernel='rbf', gamma='scale'): ...

    def __call__(self, data):
        """Train and return Nu-SVM model."""
```
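
As a quick sketch (using the bundled iris dataset and the cross-validation style shown in the usage examples below), the RBF and linear variants can be compared directly:

```python
from Orange.data import Table
from Orange.classification import SVMLearner, LinearSVMLearner
from Orange.evaluation import CrossValidation, CA

data = Table("iris")

# RBF kernel with moderate regularization vs. a purely linear boundary
svm = SVMLearner(kernel='rbf', C=1.0, gamma='scale')
linear_svm = LinearSVMLearner(C=1.0)

results = CrossValidation(data, [svm, linear_svm], k=5)
for name, score in zip(["SVM (rbf)", "Linear SVM"], CA(results)):
    print(f"{name}: {score:.3f}")
```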

### Ensemble Methods

Ensemble algorithms that combine multiple learners for improved performance.

```python { .api }
class RandomForestLearner:
    """
    Random Forest classifier.

    Args:
        n_estimators: Number of trees
        max_depth: Maximum tree depth
        max_features: Number of features considered at each split
        bootstrap: Use bootstrap sampling
    """
    def __init__(self, n_estimators=10, max_depth=None,
                 max_features='sqrt', bootstrap=True): ...

    def __call__(self, data):
        """Train and return random forest model."""

class SimpleRandomForestLearner:
    """Optimized random forest implementation."""
    def __init__(self, n_estimators=10, max_depth=3): ...

    def __call__(self, data):
        """Train and return simple random forest model."""

class GBClassifier:
    """Gradient Boosting classifier."""
    def __init__(self, n_estimators=100, learning_rate=0.1, max_depth=3): ...

    def __call__(self, data):
        """Train and return gradient boosting model."""
```

### Probabilistic Classifiers

Algorithms based on probabilistic modeling.

```python { .api }
class NaiveBayesLearner:
    """
    Naive Bayes classifier.
    """
    def __call__(self, data):
        """Train and return Naive Bayes model."""
```
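
A minimal sketch: the learner takes no arguments, so training and probability estimation collapse into two calls (iris is the bundled example dataset):

```python
from Orange.data import Table
from Orange.classification import NaiveBayesLearner

data = Table("iris")

# Train in a single step: learner(data) returns a fitted model
nb_model = NaiveBayesLearner()(data)

# Class probabilities for the first three instances
probs = nb_model(data[:3], nb_model.Probs)
print(probs)
```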

### Instance-Based Learning

k-Nearest Neighbors and related algorithms.

```python { .api }
class KNNLearner:
    """
    k-Nearest Neighbors classifier.

    Args:
        n_neighbors: Number of neighbors
        metric: Distance metric
        weights: Weight function ('uniform', 'distance')
    """
    def __init__(self, n_neighbors=5, metric='euclidean', weights='uniform'): ...

    def __call__(self, data):
        """Train and return k-NN model."""
```
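
A short sketch assuming the signature above; distance-weighted voting is often a better choice than uniform weights when class regions overlap:

```python
from Orange.data import Table
from Orange.classification import KNNLearner
from Orange.evaluation import CrossValidation, CA

data = Table("iris")

# 7 neighbors, votes weighted by inverse distance
knn = KNNLearner(n_neighbors=7, metric='euclidean', weights='distance')

results = CrossValidation(data, [knn], k=10)
print(f"kNN accuracy: {CA(results)[0]:.3f}")
```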

### Neural Networks

Multi-layer perceptron classifiers.

```python { .api }
class NNClassificationLearner:
    """
    Neural network classifier.

    Args:
        hidden_layer_sizes: Tuple of hidden layer sizes
        activation: Activation function
        solver: Optimization solver
        learning_rate_init: Initial learning rate
    """
    def __init__(self, hidden_layer_sizes=(100,), activation='relu',
                 solver='adam', learning_rate_init=0.001): ...

    def __call__(self, data):
        """Train and return neural network model."""
```
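
A brief sketch following the parameters above; the hidden-layer sizes and learning rate are illustrative values, not recommendations:

```python
from Orange.data import Table
from Orange.classification import NNClassificationLearner
from Orange.evaluation import CrossValidation, CA

data = Table("iris")

# Two hidden layers (50 and 25 units), Adam optimizer
nn = NNClassificationLearner(hidden_layer_sizes=(50, 25), activation='relu',
                             solver='adam', learning_rate_init=0.001)

results = CrossValidation(data, [nn], k=5)
print(f"Neural network accuracy: {CA(results)[0]:.3f}")
```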

### Gradient Descent

Stochastic gradient descent-based classifiers.

```python { .api }
class SGDClassificationLearner:
    """
    Stochastic Gradient Descent classifier.

    Args:
        loss: Loss function ('hinge', 'log', 'perceptron')
        penalty: Regularization ('l1', 'l2', 'elasticnet')
        alpha: Regularization strength
    """
    def __init__(self, loss='hinge', penalty='l2', alpha=0.0001): ...

    def __call__(self, data):
        """Train and return SGD model."""
```
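
A minimal sketch: with the hinge loss this behaves like a linear SVM trained by stochastic gradient descent; swapping the loss changes the underlying model:

```python
from Orange.data import Table
from Orange.classification import SGDClassificationLearner
from Orange.evaluation import CrossValidation, CA

data = Table("iris")

# Hinge loss -> linear SVM; 'log' would give logistic regression instead
sgd = SGDClassificationLearner(loss='hinge', penalty='l2', alpha=0.0001)

results = CrossValidation(data, [sgd], k=5)
print(f"SGD accuracy: {CA(results)[0]:.3f}")
```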

### Baseline Classifiers

Simple baseline algorithms for comparison.

```python { .api }
class MajorityLearner:
    """Always predicts the majority class."""
    def __call__(self, data):
        """Train and return majority class model."""
```
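
The majority classifier is mainly useful as a sanity-check floor; a sketch comparing it against a real learner:

```python
from Orange.data import Table
from Orange.classification import MajorityLearner, TreeLearner
from Orange.evaluation import CrossValidation, CA

data = Table("iris")

# Any useful classifier should clearly beat the majority-class baseline
results = CrossValidation(data, [MajorityLearner(), TreeLearner()], k=10)
baseline_ca, tree_ca = CA(results)
print(f"Majority baseline: {baseline_ca:.3f}, Tree: {tree_ca:.3f}")
```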

### Rule-Based Learning

Rule induction algorithms.

```python { .api }
class CN2Learner:
    """
    CN2 rule learning algorithm.

    Args:
        rule_finder: Rule finding strategy
        quality_evaluator: Rule quality measure
    """
    def __init__(self, rule_finder=None, quality_evaluator=None): ...

    def __call__(self, data):
        """Train and return CN2 rule model."""

class CN2UnorderedLearner:
    """CN2 algorithm producing unordered rules."""
    def __init__(self): ...

    def __call__(self, data):
        """Train and return unordered CN2 model."""
```

### Outlier Detection

Algorithms for identifying anomalous instances.

```python { .api }
class LocalOutlierFactorLearner:
    """
    Local Outlier Factor for outlier detection.

    Args:
        n_neighbors: Number of neighbors
        contamination: Expected proportion of outliers
    """
    def __init__(self, n_neighbors=20, contamination=0.1): ...

    def __call__(self, data):
        """Train and return LOF model."""

class IsolationForestLearner:
    """Isolation Forest for outlier detection."""
    def __init__(self, n_estimators=100, contamination=0.1): ...

    def __call__(self, data):
        """Train and return isolation forest model."""

class OneClassSVMLearner:
    """One-class SVM for outlier detection."""
    def __init__(self, kernel='rbf', gamma='scale', nu=0.05): ...

    def __call__(self, data):
        """Train and return one-class SVM model."""
```
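
A minimal sketch of outlier screening with the Isolation Forest learner above; how flagged instances are represented in the model's output varies between Orange versions, so this simply prints what the trained model returns:

```python
from Orange.data import Table
from Orange.classification import IsolationForestLearner

data = Table("iris")

# Expect roughly 5% of instances to be flagged as outliers
iso_model = IsolationForestLearner(n_estimators=100, contamination=0.05)(data)

# Apply the trained detector back to the data and inspect the result
flagged = iso_model(data)
print(flagged)
```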

### Model Calibration

Probability calibration for better confidence estimates.

```python { .api }
class CalibratedLearner:
    """
    Calibrated classifier for better probability estimates.

    Args:
        base_learner: Base classification algorithm
        method: Calibration method ('sigmoid', 'isotonic')
    """
    def __init__(self, base_learner, method='sigmoid'): ...

    def __call__(self, data):
        """Train and return calibrated model."""

class ThresholdLearner:
    """Threshold-based binary classifier."""
    def __init__(self, threshold=0.5): ...

    def __call__(self, data):
        """Train and return threshold model."""
```
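
A sketch using the constructor signature documented above (argument names may differ slightly between Orange versions); heart_disease is a bundled binary-class dataset, which is where calibrated probabilities matter most:

```python
from Orange.data import Table
from Orange.classification import CalibratedLearner, NaiveBayesLearner

data = Table("heart_disease")   # bundled binary classification dataset

# Wrap a base learner so its probability estimates are sigmoid-calibrated
calibrated = CalibratedLearner(NaiveBayesLearner(), method='sigmoid')
cal_model = calibrated(data)

# Calibrated class probabilities for the first three instances
probs = cal_model(data[:3], cal_model.Probs)
print(probs)
```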

### Usage Examples

```python
# Basic classification workflow
from Orange.data import Table
from Orange.classification import TreeLearner, LogisticRegressionLearner
from Orange.evaluation import CrossValidation, CA

# Load data
data = Table("iris")

# Create learners
tree = TreeLearner(max_depth=5)
logistic = LogisticRegressionLearner(C=1.0)

# Train models
tree_model = tree(data)
logistic_model = logistic(data)

# Make predictions
predictions = tree_model(data[:5])
probabilities = logistic_model(data[:5], logistic_model.Probs)

# Evaluate with cross-validation
results = CrossValidation(data, [tree, logistic], k=10)
accuracies = CA(results)
print(f"Tree accuracy: {accuracies[0]:.3f}")
print(f"Logistic accuracy: {accuracies[1]:.3f}")

# Ensemble example
from Orange.classification import RandomForestLearner
rf = RandomForestLearner(n_estimators=50, max_depth=10)
rf_model = rf(data)

# Rule learning example
from Orange.classification import CN2Learner
cn2 = CN2Learner()
rule_model = cn2(data)
```