tessl/pypi-pyrate-limiter

Python rate limiter using the leaky-bucket algorithm to control request rates in applications, with multiple backend storage options.

Storage Backends

Multiple storage backends for persisting rate limit state across application restarts, processes, and distributed deployments. Each backend provides the same interface while offering different persistence and scalability characteristics.
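As a rough mental model of the contract every backend implements, here is a toy, stdlib-only sketch of a sliding-window bucket (illustrative only; this is not pyrate-limiter's code, and the class and method bodies are our own simplification):

```python
class ToyBucket:
    """Toy sliding-window bucket: stores one timestamp per accepted item.
    Illustrates the put/leak contract shared by all backends."""

    def __init__(self, limit: int, interval_ms: int):
        self.limit = limit
        self.interval_ms = interval_ms
        self.items: list[int] = []  # timestamps in milliseconds

    def leak(self, now_ms: int) -> int:
        """Drop items older than the interval; return how many were removed."""
        cutoff = now_ms - self.interval_ms
        before = len(self.items)
        self.items = [t for t in self.items if t > cutoff]
        return before - len(self.items)

    def put(self, now_ms: int) -> bool:
        """Accept the item only if the window is not yet full."""
        self.leak(now_ms)
        if len(self.items) >= self.limit:
            return False
        self.items.append(now_ms)
        return True

bucket = ToyBucket(limit=3, interval_ms=1000)
print([bucket.put(0) for _ in range(4)])  # [True, True, True, False]
```

The real backends below implement the same idea, differing only in where the items live (process memory, SQLite, Redis, PostgreSQL, or shared memory).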

Capabilities

In-Memory Bucket

Fast, thread-safe bucket using native Python lists for simple applications.

class InMemoryBucket(AbstractBucket):
    def __init__(self, rates: List[Rate]):
        """
        Initialize in-memory bucket with rate configurations.
        
        Parameters:
        - rates: List of rate limit configurations
        
        Characteristics:
        - Fast and precise
        - Not persistent across restarts
        - Not scalable across processes
        - Suitable for single-process applications
        """

Usage example:

from pyrate_limiter import InMemoryBucket, Rate, Duration, Limiter

# Create in-memory bucket
rates = [Rate(10, Duration.SECOND), Rate(100, Duration.MINUTE)]
bucket = InMemoryBucket(rates)

# Use with limiter
limiter = Limiter(bucket)

SQLite Bucket

Persistent, file-based bucket using SQLite for cross-process rate limiting.

class SQLiteBucket(AbstractBucket):
    def __init__(
        self, 
        rates: List[Rate], 
        conn: sqlite3.Connection, 
        table: str, 
        lock=None
    ):
        """
        Initialize SQLite bucket with database connection.
        
        Parameters:
        - rates: List of rate limit configurations
        - conn: SQLite database connection
        - table: Table name for storing rate data
        - lock: Optional lock for thread safety
        """
    
    @classmethod
    def init_from_file(
        cls,
        rates: List[Rate],
        table: str = "rate_bucket",
        db_path: Optional[str] = None,
        create_new_table: bool = True,
        use_file_lock: bool = False
    ) -> "SQLiteBucket":
        """
        Create SQLite bucket from file path.
        
        Parameters:
        - rates: List of rate limit configurations
        - table: Table name for rate data (default: "rate_bucket")
        - db_path: Path to SQLite database file (None for temporary file)
        - create_new_table: Create table if it doesn't exist (default: True)
        - use_file_lock: Enable file locking for multiprocessing
        
        Returns:
        - SQLiteBucket: Configured bucket instance
        """

Usage example:

from pyrate_limiter import SQLiteBucket, Rate, Duration, Limiter
import sqlite3

# Method 1: Direct connection
conn = sqlite3.connect("rate_limits.db")
bucket = SQLiteBucket([Rate(10, Duration.SECOND)], conn, "api_limits")

# Method 2: From file (recommended)
bucket = SQLiteBucket.init_from_file(
    rates=[Rate(5, Duration.SECOND)],
    table="user_limits",
    db_path="rate_limits.db",
    create_new_table=True,
    use_file_lock=True  # For multiprocessing
)

limiter = Limiter(bucket)

Redis Bucket

Distributed bucket using Redis for scalable, cross-instance rate limiting.

class RedisBucket(AbstractBucket):
    def __init__(
        self,
        rates: List[Rate],
        redis: Union[Redis, AsyncRedis],
        bucket_key: str,
        script_hash: str
    ):
        """
        Initialize Redis bucket with Redis client.
        
        Parameters:
        - rates: List of rate limit configurations
        - redis: Redis client (sync or async)
        - bucket_key: Key prefix for Redis operations
        - script_hash: Hash of the loaded Lua script
        
        Note: Use the init() class method for normal initialization.
        """
    
    @classmethod
    def init(
        cls,
        rates: List[Rate],
        redis: Union[Redis, AsyncRedis],
        bucket_key: str
    ):
        """
        Create Redis bucket with automatic Lua script loading.
        
        Parameters:
        - rates: List of rate limit configurations
        - redis: Redis client (sync or async)
        - bucket_key: Key prefix for Redis operations
        
        Returns:
        - RedisBucket: Configured bucket instance (or awaitable for async)
        
        Characteristics:
        - Distributed and scalable
        - Supports both sync and async operations
        - Requires Redis server
        - Atomic operations using Lua scripts
        """

Usage example:

from pyrate_limiter import RedisBucket, Rate, Duration, Limiter
import redis

# Sync Redis
redis_client = redis.Redis(host='localhost', port=6379, db=0)
bucket = RedisBucket.init(
    rates=[Rate(100, Duration.MINUTE)],
    redis=redis_client,
    bucket_key="api_rate_limits"
)

# Async Redis
import redis.asyncio as aioredis

async def async_example():
    redis_client = aioredis.from_url("redis://localhost")  # from_url returns the client synchronously
    bucket = await RedisBucket.init(
        rates=[Rate(50, Duration.SECOND)],
        redis=redis_client,
        bucket_key="async_limits"
    )
    limiter = Limiter(bucket)
    
    success = await limiter.try_acquire_async("user123")
    await redis_client.close()

PostgreSQL Bucket

Enterprise-grade bucket using PostgreSQL for high-performance distributed rate limiting.

class PostgresBucket(AbstractBucket):
    def __init__(
        self, 
        pool: ConnectionPool, 
        table: str, 
        rates: List[Rate]
    ):
        """
        Initialize PostgreSQL bucket with connection pool.
        
        Parameters:
        - pool: PostgreSQL connection pool
        - table: Table name for rate data
        - rates: List of rate limit configurations
        
        Characteristics:
        - High performance and reliability
        - ACID compliance
        - Supports connection pooling
        - Suitable for enterprise applications
        """

Usage example:

from pyrate_limiter import PostgresBucket, Rate, Duration, Limiter
from psycopg_pool import ConnectionPool

# Create connection pool
pool = ConnectionPool(
    "host=localhost dbname=mydb user=myuser password=mypass",
    min_size=1,
    max_size=10
)

bucket = PostgresBucket(
    pool=pool,
    table="rate_limits",
    rates=[Rate(1000, Duration.HOUR)]
)

limiter = Limiter(bucket)

# Don't forget to close the pool when done
# pool.close()

Multiprocess Bucket

Wrapper bucket for multiprocessing environments, built on shared memory and a multiprocessing lock.

class MultiprocessBucket(AbstractBucket):
    def __init__(self, rates: List[Rate], items: List[RateItem], mp_lock: LockType):
        """
        Initialize multiprocess-safe bucket.
        
        Parameters:
        - rates: List of rate limit configurations
        - items: Shared list proxy for storing rate items
        - mp_lock: Multiprocessing lock for synchronization
        
        Note: Use the init() class method for normal initialization.
        """
    
    @classmethod
    def init(cls, rates: List[Rate]):
        """
        Create multiprocess bucket with shared memory.
        
        Parameters:
        - rates: List of rate limit configurations
        
        Returns:
        - MultiprocessBucket: Configured bucket with shared state
        
        Characteristics:
        - Safe across multiple processes
        - Uses multiprocessing.Manager for shared state
        - Built on InMemoryBucket with process synchronization
        """

Usage example:

from pyrate_limiter import MultiprocessBucket, Rate, Duration, Limiter

# For multiprocessing environments
bucket = MultiprocessBucket.init(
    rates=[Rate(20, Duration.SECOND)]
)

limiter = Limiter(bucket)

Abstract Bucket Interface

All buckets implement the same interface for consistent behavior.

class AbstractBucket(ABC):
    rates: List[Rate]
    failing_rate: Optional[Rate]
    
    def put(self, item: RateItem) -> Union[bool, Awaitable[bool]]:
        """Put an item in the bucket, return True if successful."""
    
    def leak(self, current_timestamp: Optional[int] = None) -> Union[int, Awaitable[int]]:
        """Remove outdated items from bucket."""
    
    def flush(self) -> Union[None, Awaitable[None]]:
        """Flush the entire bucket."""
    
    def count(self) -> Union[int, Awaitable[int]]:
        """Count number of items in bucket."""
    
    def peek(self, index: int) -> Union[Optional[RateItem], Awaitable[Optional[RateItem]]]:
        """Peek at item at specific index."""
    
    def waiting(self, item: RateItem) -> Union[int, Awaitable[int]]:
        """Calculate time until bucket becomes available."""
    
    def limiter_lock(self) -> Optional[object]:
        """Additional lock for multiprocessing environments."""
    
    def close(self) -> None:
        """Release resources held by bucket."""
    
    def __enter__(self):
        """Enter context manager."""
    
    def __exit__(self, exc_type, exc, tb) -> None:
        """Exit context manager and cleanup resources."""

Bucket Wrapper

Adapter that gives any synchronous bucket an asynchronous interface.

class BucketAsyncWrapper(AbstractBucket):
    def __init__(self, bucket: AbstractBucket):
        """
        Wrap any bucket to provide async interface.
        
        Parameters:
        - bucket: Synchronous bucket to wrap
        """

Usage example:

from pyrate_limiter import BucketAsyncWrapper, InMemoryBucket, Rate, RateItem, Duration

# Wrap sync bucket for async usage
sync_bucket = InMemoryBucket([Rate(10, Duration.SECOND)])
async_bucket = BucketAsyncWrapper(sync_bucket)

# Use in async context
async def async_rate_limiting():
    success = await async_bucket.put(RateItem("user123", 1640995200000))
    count = await async_bucket.count()

Choosing a Storage Backend

  • InMemoryBucket: Single-process applications, temporary rate limiting
  • SQLiteBucket: Cross-process applications, persistent rate limiting, moderate scale
  • RedisBucket: Distributed applications, high scale, shared rate limiting
  • PostgresBucket: Enterprise applications, ACID compliance, complex queries
  • MultiprocessBucket: Multiprocessing applications coordinating through shared memory and a multiprocessing lock

Install with Tessl CLI

npx tessl i tessl/pypi-pyrate-limiter
