
# Factory Patterns

Pre-configured factory functions and patterns for common use cases including multiprocessing support and simplified bucket creation. These functions provide convenient ways to create limiters without manually configuring buckets and clocks.

## Capabilities

### SQLite Limiter Factory

Create persistent limiters using SQLite storage with common configurations.

```python { .api }
def create_sqlite_limiter(
    rate_per_duration: int = 3,
    duration: Union[int, Duration] = Duration.SECOND,
    db_path: Optional[str] = None,
    table_name: str = "rate_bucket",
    max_delay: Union[int, Duration] = Duration.DAY,
    buffer_ms: int = 50,
    use_file_lock: bool = False,
    async_wrapper: bool = False
) -> Limiter:
    """
    Create a SQLite-backed rate limiter with configurable rate, persistence, and optional async support.

    Parameters:
    - rate_per_duration: Number of allowed requests per duration (default: 3)
    - duration: Time window for the rate limit (default: Duration.SECOND)
    - db_path: Path to the SQLite database file (None for a temporary file)
    - table_name: Name of the table used for rate buckets (default: "rate_bucket")
    - max_delay: Maximum delay before failing requests (default: Duration.DAY)
    - buffer_ms: Extra wait time in milliseconds to account for clock drift (default: 50)
    - use_file_lock: Enable file locking for multi-process synchronization (default: False)
    - async_wrapper: Whether to wrap the bucket for async usage (default: False)

    Returns:
    - Limiter: Configured SQLite-backed limiter instance
    """
```

Usage example:

```python
from pyrate_limiter import create_sqlite_limiter, Duration

# Basic SQLite limiter
limiter1 = create_sqlite_limiter(5, Duration.SECOND)

# Persistent SQLite limiter with a custom database
limiter2 = create_sqlite_limiter(
    rate_per_duration=100,
    duration=Duration.MINUTE,
    db_path="/var/lib/myapp/rate_limits.db",
    table_name="api_limits",
    use_file_lock=True  # For multiprocessing
)

# Async SQLite limiter
limiter3 = create_sqlite_limiter(
    rate_per_duration=10,
    duration=Duration.SECOND,
    db_path="rate_limits.db",
    max_delay=Duration.SECOND * 5,
    async_wrapper=True
)
```

### In-Memory Limiter Factory

Create fast in-memory limiters for simple applications.

```python { .api }
def create_inmemory_limiter(
    rate_per_duration: int = 3,
    duration: Union[int, Duration] = Duration.SECOND,
    max_delay: Union[int, Duration] = Duration.DAY,
    buffer_ms: int = 50,
    async_wrapper: bool = False
) -> Limiter:
    """
    Create an in-memory rate limiter with configurable rate, duration, delay, and optional async support.

    Parameters:
    - rate_per_duration: Number of allowed requests per duration (default: 3)
    - duration: Time window for the rate limit (default: Duration.SECOND)
    - max_delay: Maximum delay before failing requests (default: Duration.DAY)
    - buffer_ms: Extra wait time in milliseconds to account for clock drift (default: 50)
    - async_wrapper: Whether to wrap the bucket for async usage (default: False)

    Returns:
    - Limiter: Configured in-memory limiter instance
    """
```

Usage example:

```python
from pyrate_limiter import create_inmemory_limiter, Duration

# Basic in-memory limiter (3 requests per second)
limiter1 = create_inmemory_limiter()

# Custom rate limiter
limiter2 = create_inmemory_limiter(
    rate_per_duration=20,
    duration=Duration.MINUTE,
    max_delay=Duration.SECOND * 10
)

# Async in-memory limiter
limiter3 = create_inmemory_limiter(
    rate_per_duration=50,
    duration=Duration.SECOND,
    async_wrapper=True
)
```
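The `async_wrapper` flag wraps the underlying bucket so that its blocking calls do not stall the event loop. As a rough illustration of that idea (a generic sketch with toy classes, not pyrate-limiter internals), such a wrapper can hand the blocking acquire off to a worker thread:

```python
import asyncio
import threading
import time

class SyncBucket:
    """Toy stand-in for a blocking rate-limit bucket (hypothetical class)."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.used = 0
        self._lock = threading.Lock()

    def acquire(self) -> bool:
        time.sleep(0.01)  # pretend this blocks on I/O (e.g. a database round-trip)
        with self._lock:
            if self.used < self.capacity:
                self.used += 1
                return True
            return False

class AsyncBucketWrapper:
    """Delegate the blocking acquire to a worker thread so the event loop stays free."""
    def __init__(self, bucket: SyncBucket):
        self._bucket = bucket

    async def acquire(self) -> bool:
        return await asyncio.to_thread(self._bucket.acquire)

async def main():
    wrapper = AsyncBucketWrapper(SyncBucket(capacity=2))
    # Three concurrent acquires against a capacity of 2: exactly one is refused.
    return await asyncio.gather(*(wrapper.acquire() for _ in range(3)))

results = asyncio.run(main())
print(sorted(results))  # [False, True, True]
```

The key point is that the coroutine never blocks: `asyncio.to_thread` moves the sleep-and-check into a thread, which is the general shape of an async wrapper over synchronous storage.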

### SQLite Bucket Factory

Create SQLite buckets directly for custom limiter configurations.

```python { .api }
def create_sqlite_bucket(
    rates: List[Rate],
    db_path: Optional[str],
    table_name: str = "pyrate_limiter",
    use_file_lock: bool = False
):
    """
    Create and initialize a SQLite bucket for rate limiting.

    Parameters:
    - rates: List of rate limit configurations
    - db_path: Path to the SQLite database file (or in-memory if None)
    - table_name: Name of the table to store rate bucket data (default: "pyrate_limiter")
    - use_file_lock: Enable file locking for multi-process synchronization (default: False)

    Returns:
    - SQLiteBucket: Initialized SQLite-backed bucket
    """
```

Usage example:

```python
from pyrate_limiter import create_sqlite_bucket, Rate, Duration, Limiter

# Create a bucket with multiple rates
rates = [
    Rate(10, Duration.SECOND),
    Rate(100, Duration.MINUTE),
    Rate(1000, Duration.HOUR)
]

bucket = create_sqlite_bucket(
    rates=rates,
    db_path="complex_limits.db",
    table_name="user_limits",
    use_file_lock=True
)

# Use with a custom limiter configuration
limiter = Limiter(
    bucket,
    max_delay=Duration.SECOND * 10,
    buffer_ms=100
)
```
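A bucket configured with several rates admits a request only when every window still has capacity. As a rough sketch of that logic (plain Python with toy names, not the library's actual algorithm):

```python
import time
from collections import deque

class MultiRateWindow:
    """Toy multi-rate limiter: a request passes only if it fits
    within every (limit, period_seconds) pair."""
    def __init__(self, rates):
        # One deque of timestamps per rate.
        self.rates = [(limit, period, deque()) for limit, period in rates]

    def try_acquire(self, now=None) -> bool:
        now = time.monotonic() if now is None else now
        # Drop timestamps that have fallen out of each window.
        for limit, period, stamps in self.rates:
            while stamps and now - stamps[0] >= period:
                stamps.popleft()
        # Admit only if *every* window has room.
        if all(len(stamps) < limit for limit, _, stamps in self.rates):
            for _, _, stamps in self.rates:
                stamps.append(now)
            return True
        return False

# 2 per second AND 3 per minute: the per-minute cap binds on the 4th call.
limiter = MultiRateWindow([(2, 1.0), (3, 60.0)])
results = [limiter.try_acquire(now=t) for t in (0.0, 0.1, 1.5, 1.6)]
print(results)  # [True, True, True, False]
```

The fourth call fits the per-second window but not the per-minute one, so it is refused; the tightest window always wins.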

## Multiprocessing Support

### Global Limiter Pattern

Initialize a global limiter for multiprocessing scenarios.

```python { .api }
LIMITER: Optional[Limiter] = None

def init_global_limiter(
    bucket: AbstractBucket,
    max_delay: Union[int, Duration] = Duration.HOUR,
    raise_when_fail: bool = False,
    retry_until_max_delay: bool = True,
    buffer_ms: int = 50
) -> None:
    """
    Initialize a global Limiter instance using the provided bucket.

    Intended for use as an initializer for ProcessPoolExecutor.

    Parameters:
    - bucket: The rate-limiting bucket to be used
    - max_delay: Maximum delay before failing requests (default: Duration.HOUR)
    - raise_when_fail: Whether to raise an exception when a request fails (default: False)
    - retry_until_max_delay: Retry until the maximum delay is reached (default: True)
    - buffer_ms: Additional buffer time in milliseconds for retries (default: 50)
    """
```

Usage example:

```python
from concurrent.futures import ProcessPoolExecutor

import pyrate_limiter
from pyrate_limiter import init_global_limiter, create_sqlite_bucket
from pyrate_limiter import Rate, Duration

def worker_init():
    """Initialize the worker process with a shared rate limiter."""
    bucket = create_sqlite_bucket(
        rates=[Rate(10, Duration.SECOND)],
        db_path="/tmp/mp_rate_limits.db",
        use_file_lock=True
    )
    init_global_limiter(bucket)

def worker_task(user_id):
    """Worker function using the global rate limiter."""
    # Read LIMITER through the module so we see the instance bound by
    # init_global_limiter; a `from pyrate_limiter import LIMITER` at
    # import time would capture the stale initial None.
    limiter = pyrate_limiter.LIMITER
    if limiter and limiter.try_acquire(f"user_{user_id}"):
        return f"Processed task for user {user_id}"
    return f"Rate limited for user {user_id}"

# Use with ProcessPoolExecutor
if __name__ == "__main__":
    with ProcessPoolExecutor(
        max_workers=4,
        initializer=worker_init
    ) as executor:
        futures = [executor.submit(worker_task, i) for i in range(20)]
        results = [f.result() for f in futures]
        print(results)
```

### Multiprocess Bucket Pattern

Use MultiprocessBucket for cross-process rate limiting.

```python
from concurrent.futures import ProcessPoolExecutor

from pyrate_limiter import MultiprocessBucket, Limiter, Rate, Duration

def create_mp_limiter():
    """Create a multiprocess-safe limiter."""
    bucket = MultiprocessBucket.init(
        rates=[Rate(5, Duration.SECOND)]
    )
    return Limiter(bucket)

def worker_function(data):
    """Worker function with an individual limiter."""
    limiter = create_mp_limiter()

    if limiter.try_acquire(f"worker_{data}"):
        # Process data
        return f"Processed {data}"
    else:
        return f"Rate limited {data}"

# Each process creates its own limiter over shared storage
with ProcessPoolExecutor(max_workers=3) as executor:
    results = list(executor.map(worker_function, range(10)))
    print(results)
```

## Factory Function Comparison

| Function | Storage | Async Support | Multiprocessing | Use Case |
|----------|---------|---------------|-----------------|----------|
| `create_inmemory_limiter` | In-memory | Optional wrapper | No | Single process, fast access |
| `create_sqlite_limiter` | SQLite file | Optional wrapper | With file lock | Persistent, cross-process |
| `create_sqlite_bucket` | SQLite file | No | With file lock | Custom limiter configs |
| `init_global_limiter` | Any bucket | Depends on bucket | Yes | Process pool patterns |

## Factory Pattern Benefits

Factory functions provide several advantages:

1. **Simplified Configuration**: Reduce boilerplate code for common scenarios
2. **Best Practices**: Encode recommended configurations and patterns
3. **Consistency**: Ensure consistent setup across applications
4. **Extensibility**: Allow customization while providing sensible defaults
5. **Multiprocessing**: Handle complex multiprocessing scenarios with shared state
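The pattern behind all of these functions can be shown with a small, library-independent sketch (the `Toy*` names are stand-ins, not pyrate-limiter classes): the factory gathers validation, defaults, and assembly in one place, so call sites only name what differs.

```python
from dataclasses import dataclass

@dataclass
class ToyRate:          # toy stand-in, not pyrate-limiter's Rate
    limit: int
    period_seconds: float

@dataclass
class ToyLimiter:       # toy stand-in, not pyrate-limiter's Limiter
    rates: list
    max_delay_seconds: float
    buffer_ms: int

def create_toy_limiter(rate_per_duration: int = 3,
                       period_seconds: float = 1.0,
                       max_delay_seconds: float = 86_400.0,
                       buffer_ms: int = 50) -> ToyLimiter:
    """Factory: validation, defaults, and assembly live in one place."""
    if rate_per_duration <= 0:
        raise ValueError("rate_per_duration must be positive")
    return ToyLimiter(
        rates=[ToyRate(rate_per_duration, period_seconds)],
        max_delay_seconds=max_delay_seconds,
        buffer_ms=buffer_ms,
    )

# Call sites only spell out what differs from the defaults.
default_limiter = create_toy_limiter()
api_limiter = create_toy_limiter(rate_per_duration=100, period_seconds=60.0)
print(api_limiter.rates[0].limit)  # 100
```

Changing a recommended default (say, `buffer_ms`) then happens in one function rather than at every construction site.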