# fastai

A comprehensive deep learning library that simplifies training fast and accurate neural networks using modern best practices. Built on PyTorch, fastai offers high-level components that quickly deliver state-of-the-art results in standard deep learning domains, as well as low-level components that researchers can mix and match to build new approaches.

## Package Information

- **Package Name**: fastai
- **Package Type**: pypi
- **Language**: Python
- **Installation**: `pip install fastai`
- **Requirements**: Python 3.10+, PyTorch 1.10+

## Core Imports

The main import patterns for fastai depend on the domain:

For vision tasks:
```python
from fastai.vision.all import *
```

For text tasks:
```python
from fastai.text.all import *
```

For tabular tasks:
```python
from fastai.tabular.all import *
```

For basic functionality:
```python
from fastai.basics import *
```

For collaborative filtering:
```python
from fastai.collab import *
```

## Basic Usage

```python
from fastai.vision.all import *

# Download a sample dataset
path = untar_data(URLs.PETS)

# Create a data loader for image classification
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path),
    valid_pct=0.2, seed=42,
    label_func=lambda x: x[0].isupper(),
    item_tfms=Resize(224))

# Create a learner with a pre-trained model
learn = vision_learner(dls, resnet34, metrics=error_rate)

# Train the model
learn.fine_tune(4)

# Make predictions
pred_class, pred_idx, outputs = learn.predict(path/'images'/'test_image.jpg')
```

## Architecture

fastai is built around several key architectural concepts:

- **Learner**: Central training class that coordinates model, data, optimizer, and callbacks
- **DataLoaders**: Manages training and validation data with transforms
- **DataBlock API**: Flexible data processing pipeline construction
- **Transform Pipelines**: Modular data preprocessing and augmentation
- **Callback System**: Extensible training loop customization
- **Domain-Specific APIs**: Specialized interfaces for vision, text, tabular, and collaborative filtering

The library follows a layered API design where high-level convenience functions build on lower-level flexible components, allowing both rapid prototyping and advanced customization.
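
As an illustration of the layered design, the mid-level DataBlock API can build roughly the same pet-classification DataLoaders that the high-level `ImageDataLoaders.from_name_func` factory produces in the Basic Usage example. A minimal sketch (the regex labelling pattern is an assumption about the PETS filename convention):

```python
from fastai.vision.all import *

path = untar_data(URLs.PETS)

# Mid-level route: compose the pipeline explicitly instead of using a factory method.
pets = DataBlock(
    blocks=(ImageBlock, CategoryBlock),                          # input and target types
    get_items=get_image_files,                                   # how to list items under the source
    splitter=RandomSplitter(valid_pct=0.2, seed=42),             # train/valid split
    get_y=using_attr(RegexLabeller(r'(.+)_\d+.jpg$'), 'name'),   # label parsed from the file name
    item_tfms=Resize(224))
dls = pets.dataloaders(path/'images')
```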

## Capabilities

### Core Training Infrastructure

Central training and learning infrastructure including the main Learner class, metrics, optimization, and model management utilities.

```python { .api }
class Learner:
    def __init__(self, dls, model, loss_func=None, opt_func=Adam, lr=0.001, **kwargs): ...
    def fit(self, n_epoch, lr=None, wd=None, cbs=None): ...
    def fine_tune(self, epochs, base_lr=2e-3, freeze_epochs=1, **kwargs): ...
    def predict(self, item, with_input=False): ...

def load_learner(path, cpu=True): ...
def vision_learner(dls, arch, normalize=True, n_out=None, **kwargs): ...
def text_classifier_learner(dls, arch, seq_len=72, **kwargs): ...
def tabular_learner(dls, layers=None, emb_szs=None, **kwargs): ...
```

[Core Training](./core-training.md)
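
The Learner is not tied to the domain-specific factories; it can wrap any PyTorch `nn.Module` together with fastai `DataLoaders`. A minimal sketch with synthetic data (the tensors and network below are invented for illustration):

```python
from fastai.basics import *

# Synthetic binary-classification data: 5 features, label = whether their sum is positive.
x = torch.randn(200, 5)
y = (x.sum(dim=1, keepdim=True) > 0).float()
items = list(zip(x, y))
dls = DataLoaders.from_dsets(items[:160], items[160:], bs=16)

# Any nn.Module can be driven by a Learner.
model = nn.Sequential(nn.Linear(5, 16), nn.ReLU(), nn.Linear(16, 1))
learn = Learner(dls, model, loss_func=nn.BCEWithLogitsLoss())
learn.fit(3, lr=1e-3)
```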

### Data Loading and Processing

Comprehensive data loading system with the DataBlock API, transforms, and domain-specific data loaders for flexible data pipeline construction.

```python { .api }
class DataLoaders:
    def __init__(self, *loaders): ...
    @classmethod
    def from_dblock(cls, dblock, source, **kwargs): ...

class DataBlock:
    def __init__(self, blocks=None, dl_type=None, getters=None, n_inp=None, **kwargs): ...
    def dataloaders(self, source, **kwargs): ...

class ImageDataLoaders(DataLoaders):
    @classmethod
    def from_folder(cls, path, valid_pct=0.2, **kwargs): ...

class TextDataLoaders(DataLoaders):
    @classmethod
    def from_folder(cls, path, valid='valid', **kwargs): ...

class TabularDataLoaders(DataLoaders):
    @classmethod
    def from_csv(cls, path, y_names, **kwargs): ...
```

[Data Loading](./data-loading.md)
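
A small sketch of the high-level factory route, assuming the folder-per-class layout that the IMAGENETTE_160 sample ships with (`train/` and `val/` splits):

```python
from fastai.vision.all import *

path = untar_data(URLs.IMAGENETTE_160)
dls = ImageDataLoaders.from_folder(path, train='train', valid='val',
                                    item_tfms=Resize(160), bs=32)
dls.show_batch(max_n=6)
print(dls.vocab)   # class labels inferred from the folder names
```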

### Computer Vision

Complete computer vision toolkit including pre-trained models, data augmentation, specialized learners for classification and segmentation, and vision-specific utilities.

```python { .api }
def vision_learner(dls, arch, normalize=True, n_out=None, **kwargs): ...
def unet_learner(dls, arch, normalize=True, **kwargs): ...

class ImageDataLoaders:
    @classmethod
    def from_folder(cls, path, train='train', valid='valid', **kwargs): ...
    @classmethod
    def from_name_func(cls, path, fnames, label_func, **kwargs): ...

def aug_transforms(mult=1.0, do_flip=True, flip_vert=False, **kwargs): ...
```

[Computer Vision](./vision.md)
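
For segmentation, `unet_learner` pairs with the vision module's `SegmentationDataLoaders`. A sketch following the CAMVID_TINY sample layout (`images/`, `labels/`, `codes.txt`) used in the fastai quick-start examples:

```python
from fastai.vision.all import *

path = untar_data(URLs.CAMVID_TINY)
codes = np.loadtxt(path/'codes.txt', dtype=str)
dls = SegmentationDataLoaders.from_label_func(
    path, bs=8,
    fnames=get_image_files(path/'images'),
    label_func=lambda o: path/'labels'/f'{o.stem}_P{o.suffix}',   # mask file next to each image
    codes=codes)

learn = unet_learner(dls, resnet34)
learn.fine_tune(1)
```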

### Natural Language Processing

Text processing and NLP capabilities including language models, text classification, tokenization, and text-specific data processing.

```python { .api }
def language_model_learner(dls, arch, config=None, **kwargs): ...
def text_classifier_learner(dls, arch, seq_len=72, **kwargs): ...

class TextDataLoaders:
    @classmethod
    def from_folder(cls, path, valid='valid', **kwargs): ...
    @classmethod
    def from_csv(cls, path, text_col='text', label_col='label', **kwargs): ...

class WordTokenizer: ...
class SubwordTokenizer: ...
```

[Natural Language Processing](./text.md)
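
A short sketch of text classification on the IMDB sample, which ships `train/` and `test/` folders (the test split is used for validation here):

```python
from fastai.text.all import *

path = untar_data(URLs.IMDB)
dls = TextDataLoaders.from_folder(path, valid='test')
learn = text_classifier_learner(dls, AWD_LSTM, drop_mult=0.5, metrics=accuracy)
learn.fine_tune(2, 1e-2)
learn.predict("I really liked that movie!")
```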

### Tabular Data

Tabular data processing and modeling including preprocessing transforms, neural network architectures designed for structured data, and tabular-specific utilities.

```python { .api }
def tabular_learner(dls, layers=None, emb_szs=None, n_out=None, **kwargs): ...

class TabularDataLoaders:
    @classmethod
    def from_csv(cls, path, y_names, cat_names=None, cont_names=None, **kwargs): ...

class Categorify: ...
class FillMissing: ...
class Normalize: ...
```

[Tabular Data](./tabular.md)
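
A sketch using the bundled ADULT_SAMPLE dataset, with the standard preprocessing procs applied to categorical and continuous columns:

```python
from fastai.tabular.all import *

path = untar_data(URLs.ADULT_SAMPLE)
dls = TabularDataLoaders.from_csv(
    path/'adult.csv', path=path, y_names='salary',
    cat_names=['workclass', 'education', 'marital-status',
               'occupation', 'relationship', 'race'],
    cont_names=['age', 'fnlwgt', 'education-num'],
    procs=[Categorify, FillMissing, Normalize])

learn = tabular_learner(dls, metrics=accuracy)
learn.fit_one_cycle(2)
```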

### Collaborative Filtering

Recommendation system capabilities including specialized learners and models for collaborative filtering tasks.

```python { .api }
def collab_learner(dls, n_factors=50, **kwargs): ...

class CollabDataLoaders:
    @classmethod
    def from_csv(cls, path, user_name=None, item_name=None, **kwargs): ...

class EmbeddingDotBias: ...
```

[Collaborative Filtering](./collaborative-filtering.md)
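
A sketch using the bundled MovieLens ratings sample; `y_range` bounds the predicted rating:

```python
from fastai.collab import *

path = untar_data(URLs.ML_SAMPLE)
dls = CollabDataLoaders.from_csv(path/'ratings.csv')
learn = collab_learner(dls, y_range=(0.5, 5.5))
learn.fine_tune(3)
```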

### Callbacks and Training Customization

Extensive callback system for customizing the training loop including progress tracking, learning rate scheduling, regularization, and logging.

```python { .api }
class Callback:
    def before_fit(self): ...
    def before_epoch(self): ...
    def before_batch(self): ...

class MixedPrecision(Callback): ...
class OneCycleTraining(Callback): ...
class EarlyStoppingCallback(Callback): ...
class SaveModelCallback(Callback): ...
```

[Callbacks](./callbacks.md)
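
Callbacks can be passed to any learner via `cbs`, and custom callbacks override only the hook methods they need. A minimal sketch (the `PrintEpochCallback` class and the MNIST_SAMPLE setup are illustrative, not part of fastai):

```python
from fastai.vision.all import *

class PrintEpochCallback(Callback):
    "Log the epoch number at the start of every epoch."
    def before_epoch(self): print(f"starting epoch {self.epoch}")

path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = vision_learner(dls, resnet18, metrics=error_rate,
                       cbs=[PrintEpochCallback(),
                            EarlyStoppingCallback(monitor='valid_loss', patience=2),
                            SaveModelCallback(monitor='valid_loss')])
learn.fine_tune(5)
```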

### Metrics and Loss Functions

Comprehensive metrics for evaluating model performance and loss functions for training across different domains and tasks.

```python { .api }
def accuracy(inp, targ): ...
def error_rate(inp, targ): ...
def top_k_accuracy(inp, targ, k=5): ...

class CrossEntropyLossFlat: ...
class MSELossFlat: ...
class FocalLoss: ...
```

[Metrics and Losses](./metrics-losses.md)
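
Losses and metrics are passed directly to a learner, and metric functions can also be applied by hand to predictions. A sketch on the MNIST_SAMPLE data:

```python
from fastai.vision.all import *

path = untar_data(URLs.MNIST_SAMPLE)
dls = ImageDataLoaders.from_folder(path)
learn = vision_learner(dls, resnet18,
                       loss_func=CrossEntropyLossFlat(),
                       metrics=[accuracy, error_rate])
learn.fine_tune(1)

preds, targs = learn.get_preds()          # validation-set probabilities and targets
print(accuracy(preds, targs), error_rate(preds, targs))
```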

### Model Interpretation

Tools for understanding and interpreting model predictions including visualization utilities and analysis methods.

```python { .api }
class ClassificationInterpretation:
    @classmethod
    def from_learner(cls, learn, **kwargs): ...
    def plot_confusion_matrix(self, **kwargs): ...
    def plot_top_losses(self, k, **kwargs): ...
```

[Model Interpretation](./interpretation.md)
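
A sketch that continues the Basic Usage pets example and inspects where the trained model goes wrong:

```python
from fastai.vision.all import *

path = untar_data(URLs.PETS)/'images'
dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=lambda f: f[0].isupper(), item_tfms=Resize(224))
learn = vision_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)

interp = ClassificationInterpretation.from_learner(learn)
interp.plot_confusion_matrix(figsize=(4, 4))
interp.plot_top_losses(9, figsize=(7, 7))   # the items the model got most wrong
```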

### Medical Imaging

Specialized tools for working with DICOM files from CT, MRI, X-ray, and other medical imaging modalities, including proper windowing, normalization, and processing.

```python { .api }
def get_dicom_files(path, recurse=True, folders=None): ...
def dcmread(fn, force=False): ...

class DicomSegmentationDataLoaders(DataLoaders):
    @classmethod
    def from_label_func(cls, path, fnames, label_func, **kwargs): ...

class TensorDicom(TensorImage): ...
class PILDicom(PILBase): ...

# Predefined medical windows
dicom_windows = SimpleNamespace(
    brain=(80,40), subdural=(254,100), stroke=(8,32),
    brain_bone=(2800,600), lungs=(1500,-600), liver=(150,30)
)
```

[Medical Imaging](./medical.md)
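
A short sketch, assuming the SIIM_SMALL sample set used by the fastai medical imaging tutorial is available and pydicom is installed; the display and windowing calls follow that tutorial and should be treated as illustrative:

```python
from fastai.vision.all import *
from fastai.medical.imaging import *   # requires the pydicom dependency

path = untar_data(URLs.SIIM_SMALL)
files = get_dicom_files(path/'train')
dcm = files[0].dcmread()                      # a pydicom Dataset for the first file
dcm.show()                                    # render the raw X-ray
scaled = dcm.windowed(*dicom_windows.lungs)   # pixel tensor rescaled with a lung window
```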