# pytest-codspeed

A pytest plugin that enables performance benchmarking integrated with the CodSpeed continuous benchmarking platform. It provides both marker-based and fixture-based approaches for measuring performance, supporting both wall-clock time and instruction-based measurement.
## Package Information

- **Package Name**: pytest-codspeed
- **Language**: Python
- **Installation**: `pip install pytest-codspeed`
- **Supported Python**: 3.9+

## Core Imports

```python
import pytest
from pytest_codspeed import BenchmarkFixture
```

The plugin automatically registers with pytest when installed, making the `benchmark` fixture and markers available.
## Basic Usage
```python
import pytest

# Approach 1: Measure entire test function with marker
@pytest.mark.benchmark
def test_sum_performance():
    data = list(range(1000))
    result = sum(x * x for x in data)
    assert result == 332833500

# Approach 2: Measure specific code with fixture
def test_calculation_performance(benchmark):
    data = list(range(1000))

    def calculate():
        return sum(x * x for x in data)

    result = benchmark(calculate)
    assert result == 332833500
```

## Architecture
pytest-codspeed integrates deeply with pytest through a comprehensive plugin system:

- **Plugin Hooks**: Customizes pytest behavior to filter and execute benchmarks
- **Measurement Instruments**: Pluggable backends for walltime vs instruction-based measurement
- **Fixture System**: Provides `benchmark` and `codspeed_benchmark` fixtures with advanced measurement options
- **CLI Integration**: Adds `--codspeed` flag and configuration options
- **Results Reporting**: Rich terminal output and JSON export for CI/CD integration

The plugin automatically disables other benchmark plugins (pytest-benchmark, pytest-speed) when active to prevent conflicts.
## Capabilities
### Basic Benchmarking
Core benchmarking functionality using pytest markers and fixtures. Provides simple decorator-based and fixture-based approaches to measure function performance.
```python { .api }
# Pytest markers
@pytest.mark.benchmark
@pytest.mark.codspeed_benchmark

# Fixture function
def benchmark(target: Callable, *args, **kwargs) -> Any: ...
```

[Basic Benchmarking](./basic-benchmarking.md)
### Configuration Options
Command-line options and marker parameters for controlling benchmark behavior, including time limits, warmup periods, and measurement modes.
```python { .api }
# CLI options: --codspeed, --codspeed-mode, --codspeed-warmup-time, etc.
# Marker options: group, min_time, max_time, max_rounds

@pytest.mark.benchmark(group="math", max_time=5.0, max_rounds=100)
def test_with_options(): ...
```

[Configuration](./configuration.md)
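
As a sketch, the `group` parameter lets related benchmarks be reported together; the group name and the `max_time` limit below are illustrative values, not defaults:

```python
import pytest

# Both benchmarks report under the "sorting" group
@pytest.mark.benchmark(group="sorting")
def test_sorted_copy():
    data = [5, 3, 1, 4, 2]
    assert sorted(data) == [1, 2, 3, 4, 5]

@pytest.mark.benchmark(group="sorting", max_time=2.0)
def test_sort_in_place():
    data = [5, 3, 1, 4, 2]
    data.sort()
    assert data == [1, 2, 3, 4, 5]
```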
### Advanced Benchmarking
Precise measurement control with setup/teardown functions, multiple rounds, and custom iterations for complex benchmarking scenarios.
```python { .api }
# Method on the benchmark fixture
def pedantic(
    self,
    target: Callable,
    args: tuple = (),
    kwargs: dict = {},
    setup: Callable | None = None,
    teardown: Callable | None = None,
    rounds: int = 1,
    warmup_rounds: int = 0,
    iterations: int = 1
) -> Any: ...
```

[Advanced Benchmarking](./advanced-benchmarking.md)
## Environment Integration
pytest-codspeed automatically detects CodSpeed CI environments via the `CODSPEED_ENV` environment variable. When running locally, use the `--codspeed` flag to enable benchmarking:
```bash
pytest tests/ --codspeed
```

## Types
```python { .api }
class BenchmarkFixture:
    """Main benchmark fixture class providing measurement functionality."""

    def __call__(self, target: Callable, *args, **kwargs) -> Any:
        """Execute and measure target function."""
        ...

    def pedantic(
        self,
        target: Callable,
        args: tuple = (),
        kwargs: dict = {},
        setup: Callable | None = None,
        teardown: Callable | None = None,
        rounds: int = 1,
        warmup_rounds: int = 0,
        iterations: int = 1
    ) -> Any:
        """Advanced benchmarking with setup/teardown control."""
        ...
```