# Testing Framework

Run tests defined in notebook cells across your project in parallel. nbdev's testing system executes notebook cells marked as tests and provides detailed reporting on failures and performance.

## Capabilities

### Main Test Functions

Execute tests across all notebooks or specific notebook files.

```python { .api }
def nbdev_test(path: str = None, flags: str = None, n_workers: int = None,
               verbose: bool = True, force_flags: str = None,
               do_print: bool = False, timing: bool = False):
    """
    Run tests from notebooks in parallel.

    Args:
        path: Path to notebook(s) to test (default: all notebooks in nbs_path)
        flags: Comma-separated flags marking cells to skip during testing
        n_workers: Number of parallel workers (default: number of CPUs)
        verbose: Print detailed test results
        force_flags: Flags marking cells to always run regardless of skip flags
        do_print: Print test completion messages
        timing: Display timing information for test execution

    Executes all notebook cells except those marked with skip flags,
    running notebooks in parallel for efficiency.
    """

def test_nb(fn: str, skip_flags: list = None, force_flags: list = None,
            do_print: bool = False, showerr: bool = True, basepath: str = None):
    """
    Execute tests in a single notebook.

    Args:
        fn: Path to notebook file to test
        skip_flags: List of flags marking cells to skip
        force_flags: List of flags marking cells to always run
        do_print: Print completion messages
        showerr: Print errors to stderr
        basepath: Path to add to sys.path for imports

    Returns:
        Tuple of (success: bool, execution_time: float)
    """
```

**Usage Examples:**

```python
from nbdev.test import nbdev_test, test_nb

# Run all tests in the project
nbdev_test()

# Run tests with specific skip flags
nbdev_test(flags='slow,gpu')

# Run tests with timing information
nbdev_test(timing=True, verbose=True)

# Test a specific notebook
success, exec_time = test_nb('notebooks/01_core.ipynb')
print(f"Test completed in {exec_time:.2f}s - {'PASSED' if success else 'FAILED'}")

# Test with custom flags
test_nb('example.ipynb', skip_flags=['slow'], force_flags=['critical'])
```
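
Since `test_nb` returns a `(success, execution_time)` tuple, results from several notebooks compose naturally into a run summary. A self-contained sketch (the `summarize` helper and the notebook names are illustrative, not nbdev API):

```python
def summarize(results):
    """Summarize (name, success, seconds) triples as test_nb calls might yield them."""
    failed = [name for name, ok, _ in results if not ok]
    total = sum(secs for _, _, secs in results)
    status = "PASSED" if not failed else f"FAILED: {', '.join(failed)}"
    return f"{len(results) - len(failed)}/{len(results)} notebooks in {total:.2f}s - {status}"

# Hard-coded results standing in for actual test_nb calls
results = [("01_core.ipynb", True, 1.50), ("02_utils.ipynb", False, 0.75)]
print(summarize(results))  # → 1/2 notebooks in 2.25s - FAILED: 02_utils.ipynb
```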
## Test Cell Directives

Control test execution using special cell directives in notebooks:

### Skip Directives

- `#|skip`: Skip this cell during testing
- `#|slow`: Mark as a slow test (skipped with `flags='slow'`)
- `#|cuda`: Mark as requiring CUDA (skip when no GPU is available)

### Eval Directives

- `#|eval: false`: Never execute this cell
- `#|eval: true`: Always execute this cell

### Custom Flags

You can create custom test flags for your project:

```python
# In a notebook cell marked with a custom flag
#|my_custom_flag
def test_special_feature():
    assert some_condition()

# Skip cells carrying the custom flag
nbdev_test(flags='my_custom_flag')
```

## Test Environment

### Environment Variables

nbdev sets specific environment variables during testing:

- `IN_TEST=1`: Indicates code is running in test mode
- Other environment variables from project configuration

### Test Isolation

Each notebook test runs in its own execution environment:

- Tests run with a clean `sys.path` (optionally extended via `basepath`)
- Test failures in one notebook don't affect other notebooks
- Each notebook gets fresh imports
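
The `IN_TEST` variable gives notebook code a way to detect test runs. A minimal, self-contained sketch (the `in_test_mode` helper is our own, not part of nbdev):

```python
import os

# nbdev sets IN_TEST=1 while running notebook cells under test, so code can
# branch on it, e.g. to skip interactive display logic during test runs.
def in_test_mode() -> bool:
    return os.environ.get("IN_TEST") == "1"

if in_test_mode():
    print("running under nbdev_test")
```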
## Parallel Execution

Tests run in parallel for efficiency:

```python
from nbdev.test import nbdev_test

# Use all CPU cores (default)
nbdev_test()

# Limit parallel workers
nbdev_test(n_workers=4)

# Disable parallel execution
nbdev_test(n_workers=1)
```
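
The right `n_workers` value usually derives from the machine. Here is one reasonable policy as a sketch (the `pick_workers` helper is ours; nbdev itself simply defaults to the CPU count):

```python
import os

# Clamp a requested worker count to the available CPUs, defaulting to all
# cores when no request is given; never return less than 1.
def pick_workers(requested=None):
    cpus = os.cpu_count() or 1
    if requested is None:
        return cpus
    return max(1, min(requested, cpus))

# e.g. nbdev_test(n_workers=pick_workers(4))
```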
## Test Reporting

### Verbose Output

```python
from nbdev.test import nbdev_test

# Detailed test reporting
nbdev_test(verbose=True)

# Minimal output
nbdev_test(verbose=False)

# Show timing information
nbdev_test(timing=True)
```

### Error Handling

```python
from nbdev.test import test_nb

# Print errors to stderr (default)
success, exec_time = test_nb('test.ipynb', showerr=True)

# Suppress error output to stderr
success, exec_time = test_nb('test.ipynb', showerr=False)
```

## Integration with CI/CD

nbdev testing integrates well with continuous integration:

```bash
# In GitHub Actions or other CI
nbdev_test --verbose --timing

# Skip slow tests in CI
nbdev_test --flags slow

# Skip slow and GPU tests, but always run cells flagged critical
nbdev_test --flags slow,gpu --force_flags critical
```
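
When CI drives tests from a Python script rather than the CLI, the boolean from `test_nb` maps directly onto a process exit code (a sketch; the `exit_code` helper is ours):

```python
import sys

# Conventional CI contract: exit 0 on success, 1 on failure, so the job
# fails when a notebook's tests fail.
def exit_code(result):
    success, _elapsed = result
    return 0 if success else 1

# e.g. sys.exit(exit_code(test_nb('notebooks/01_core.ipynb')))
```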
## Test Organization

### Test Structure in Notebooks

```python
# Cell 1: Setup
import numpy as np
from mylib import core_function

# Cell 2: Test function
def test_core_function():
    result = core_function([1, 2, 3])
    assert len(result) == 3
    assert all(isinstance(x, int) for x in result)

# Cell 3: Run test
test_core_function()
print("✓ Core function test passed")

# Cell 4: Slow test
#|slow
def test_performance():
    # This test takes a long time
    large_data = generate_large_dataset()
    result = core_function(large_data)
    assert validate_result(result)

# Cell 5: Skip in automated testing
#|skip
def manual_test():
    # This requires manual inspection
    plot_results()
```

### Test Configuration

Configure testing behavior in `settings.ini`:

```ini
tst_flags = notest slow
```

This automatically skips cells marked with `notest` or `slow` flags.
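
`tst_flags` is a space-separated list, so the individual skip flags can be recovered with a simple split (a self-contained sketch; in a real project you would read the value via `get_config().tst_flags`):

```python
# Parse a settings.ini tst_flags value into the individual flag names.
def parse_tst_flags(value: str) -> list:
    return value.split()

print(parse_tst_flags("notest slow"))  # → ['notest', 'slow']
```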
## Best Practices

### Writing Good Tests

```python
import pytest

# Use descriptive test functions
def test_data_loading_handles_missing_files():
    """Test that data loading gracefully handles missing files."""
    with pytest.raises(FileNotFoundError):
        load_data('nonexistent.csv')

# Group related tests
def test_preprocessing_pipeline():
    """Test the complete preprocessing pipeline."""
    raw_data = create_test_data()
    processed = preprocess(raw_data)

    assert processed.shape[0] == raw_data.shape[0]
    assert 'processed_flag' in processed.columns
    assert processed['processed_flag'].all()

# Use appropriate flags
#|slow
def test_large_dataset_processing():
    """Test processing with a realistic large dataset."""
    # This test uses significant memory/time
    pass
```

### Organizing Tests

- Keep test cells close to the code they test
- Use meaningful test names and docstrings
- Mark slow or resource-intensive tests appropriately
- Include both unit tests and integration tests
- Test edge cases and error conditions

**Complete Testing Example:**

```python
from nbdev.test import nbdev_test
from nbdev.config import get_config

# Get project configuration
config = get_config()

# Run all tests with timing
print(f"Running tests for {config.lib_name}")
nbdev_test(timing=True, verbose=True)

# Run quick tests only (skip slow ones)
print("Running quick tests only...")
nbdev_test(flags='slow', timing=True)
```