A Python rate limiter using the leaky-bucket algorithm to control request rates in applications, with multiple backend storage options.
Build a distributed rate limiting system for a multi-instance API service that shares rate limit state across all server instances. Create a module with the following functionality:
@generates
def check_rate_limit(user_id: str) -> tuple[bool, int]:
    """
    Check if a request from the user should be allowed.

    Parameters:
    - user_id: Unique identifier for the user

    Returns:
    - Tuple of (allowed: bool, wait_time_ms: int)
      - allowed: True if request is allowed, False otherwise
      - wait_time_ms: Time to wait in milliseconds before retrying (0 if allowed)
    """
async def check_rate_limit_async(user_id: str) -> tuple[bool, int]:
    """
    Asynchronously check if a request from the user should be allowed.

    Parameters:
    - user_id: Unique identifier for the user

    Returns:
    - Tuple of (allowed: bool, wait_time_ms: int)
      - allowed: True if request is allowed, False otherwise
      - wait_time_ms: Time to wait in milliseconds before retrying (0 if allowed)
    """
def reset_user_limit(user_id: str) -> None:
    """
    Reset rate limit state for a specific user.

    Parameters:
    - user_id: Unique identifier for the user
    """
def get_user_count(user_id: str) -> int:
    """
    Get the current request count for a user within the current time window.

    Parameters:
    - user_id: Unique identifier for the user

    Returns:
    - Current count of requests for the user
    """

Provides distributed rate limiting capabilities with Redis backend support.
Requires a Python Redis client for connecting to the Redis server.
Install with Tessl CLI
npx tessl i tessl/pypi-pyrate-limiter
docs
evals
scenario-1
scenario-2
scenario-3
scenario-4
scenario-5
scenario-6
scenario-7
scenario-8
scenario-9
scenario-10