An app that provides Django integration for RQ (Redis Queue).

---
Django management commands for worker control, queue monitoring, and system maintenance. These commands integrate RQ functionality with Django's management system.

## Worker Commands

Commands for starting and managing RQ workers.

### rqworker

Start RQ workers to process jobs from specified queues.
```shell
# Basic usage
python manage.py rqworker [queue_names...]

# Process default queue
python manage.py rqworker

# Process multiple queues with priority
python manage.py rqworker high default low

# Worker options
python manage.py rqworker --burst                        # Exit after processing all jobs
python manage.py rqworker --worker-class path.to.Worker  # Custom worker class
python manage.py rqworker --queue-class path.to.Queue    # Custom queue class
python manage.py rqworker --job-class path.to.Job        # Custom job class
python manage.py rqworker --sentry-dsn=DSN_URL           # Sentry integration
```

### rqworker-pool

Start an RQ worker pool with multiple processes (RQ 2.10+).
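Whether run as a single process or as a pool, workers execute ordinary Python callables imported by dotted path. A minimal sketch of such a task module (the module path and function name are hypothetical):

```python
# myapp/tasks.py (hypothetical) -- workers import jobs by dotted path,
# e.g. "myapp.tasks.send_welcome_email", so keep the module import-safe.
def send_welcome_email(user_id, template="welcome"):
    # A real task would render and send mail; returning a payload keeps
    # the function testable without a running broker.
    return {"user_id": user_id, "template": template, "status": "sent"}
```

Job arguments are serialized when enqueued, so they should be plain, picklable values.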
```shell
# Start worker pool
python manage.py rqworker-pool [queue_names...] --num-workers 4

# Pool options
python manage.py rqworker-pool default low --num-workers 8
python manage.py rqworker-pool --queue-class path.to.Queue
python manage.py rqworker-pool --job-class path.to.Job
```

## Scheduler Commands

Commands for managing job scheduling.
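Conceptually, each scheduler pass moves jobs whose scheduled time has arrived onto their queues. A simplified in-memory sketch of that selection step (not django-rq's actual implementation):

```python
from datetime import datetime, timedelta

def due_jobs(scheduled, now):
    # Jobs whose scheduled time has passed are ready to be enqueued,
    # mirroring (in simplified form) what each polling pass does.
    return [job_id for job_id, when in scheduled.items() if when <= now]

now = datetime(2024, 1, 1, 12, 0)
scheduled = {
    "report": now - timedelta(minutes=5),   # overdue: picked up
    "cleanup": now + timedelta(hours=1),    # future: left alone
}
ready = due_jobs(scheduled, now)  # → ["report"]
```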
### rqscheduler

Start the RQ scheduler daemon for processing scheduled jobs.
```shell
# Start scheduler
python manage.py rqscheduler

# Scheduler options
python manage.py rqscheduler --interval=30  # Custom polling interval
python manage.py rqscheduler --queue=high   # Specific queue
```

## Monitoring Commands

Commands for monitoring queue status and statistics.
### rqstats

Display real-time queue statistics and monitoring information.
```shell
# Show statistics once
python manage.py rqstats

# Continuous monitoring
python manage.py rqstats --interval=1  # Refresh every second

# Output formats
python manage.py rqstats --json  # JSON output
python manage.py rqstats --yaml  # YAML output
```

Statistics include per-queue job counts (queued, started, deferred, finished, and failed) and the number of active workers.
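The `--json` output is convenient to feed into dashboards or alerting. A sketch of consuming it (the exact key names here are assumptions; check them against real `rqstats --json` output):

```python
import json

def queues_with_backlog(stats_json, threshold):
    # Return names of queues whose job count exceeds the threshold.
    # The "queues"/"name"/"jobs" keys are assumed, not guaranteed.
    stats = json.loads(stats_json)
    return [q["name"] for q in stats.get("queues", []) if q.get("jobs", 0) > threshold]

sample = '{"queues": [{"name": "high", "jobs": 12}, {"name": "low", "jobs": 1}]}'
backlogged = queues_with_backlog(sample, 5)  # → ["high"]
```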
## Queue Control Commands

Commands for controlling queue behavior and job processing.

### rqsuspend

Suspend workers so they stop picking up new jobs.
```shell
# Suspend indefinitely
python manage.py rqsuspend

# Suspend for a specific duration (in seconds)
python manage.py rqsuspend -d 600           # Suspend for 10 minutes
python manage.py rqsuspend --duration=3600  # Suspend for 1 hour
```

### rqresume

Resume suspended workers to continue processing jobs.
```shell
# Resume all suspended workers
python manage.py rqresume
```

## Enqueue Commands

Commands for enqueueing jobs from the command line.
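Enqueueing from the command line means resolving a dotted path like `myapp.tasks.my_function` to a callable. That lookup can be sketched like this (a simplified stand-in, not django-rq's code):

```python
from importlib import import_module

def resolve_callable(dotted_path):
    # Split "package.module.func" into module path and attribute name,
    # import the module, and fetch the attribute.
    module_path, _, attr = dotted_path.rpartition(".")
    return getattr(import_module(module_path), attr)

# Resolving a stdlib function by dotted path as a demonstration:
join = resolve_callable("os.path.join")
```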
### rqenqueue

Enqueue a job directly from the command line.
```shell
# Enqueue function
python manage.py rqenqueue myapp.tasks.my_function arg1 arg2

# With options
python manage.py rqenqueue myapp.tasks.my_function --queue=high
python manage.py rqenqueue myapp.tasks.my_function --timeout=3600
```

## Command Reference

```python
# rqworker command options
class Command:
    """
    Django management command for starting RQ workers.

    Options:
        queue_names: Queues to process (default: 'default')
        --burst: Exit after processing all current jobs
        --worker-class: Custom worker class path
        --queue-class: Custom queue class path
        --job-class: Custom job class path
        --sentry-dsn: Sentry DSN for error tracking
        --sentry-debug: Enable Sentry debug mode
        --sentry-ca-certs: Path to CA certificates
    """
```

```python
# rqscheduler command options
class Command:
    """
    Django management command for starting the RQ scheduler.

    Options:
        --interval: Polling interval in seconds (default: 60)
        --queue: Queue name to use (default: 'default')
    """
```

```python
# rqstats command options
class Command:
    """
    Django management command for displaying queue statistics.

    Options:
        --interval: Refresh interval for continuous monitoring
        --json: Output statistics in JSON format
        --yaml: Output statistics in YAML format
    """
```

## Deployment

Example systemd service for running RQ workers:
```ini
[Unit]
Description=Django-RQ Worker
After=network.target

[Service]
Type=simple
User=www-data
WorkingDirectory=/path/to/project
Environment=DJANGO_SETTINGS_MODULE=myproject.settings
ExecStart=/path/to/venv/bin/python manage.py rqworker high default low
Restart=always
RestartSec=3

[Install]
WantedBy=multi-user.target
```

Example Docker setup for RQ workers:
```dockerfile
# Worker container
FROM python:3.11
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt
CMD ["python", "manage.py", "rqworker", "high", "default", "low"]
```

Best practices for production deployment:
```shell
# Multiple worker processes
python manage.py rqworker high &
python manage.py rqworker default &
python manage.py rqworker low &

# Worker pool for high throughput
python manage.py rqworker-pool default --num-workers 8 &

# Scheduler process
python manage.py rqscheduler &

# Monitoring process
python manage.py rqstats --interval=60 --json > /var/log/rq-stats.log &
```

## Configuration

Commands respect Django settings configuration:
```python
# settings.py
RQ_QUEUES = {
    'high': {'HOST': 'redis-high', 'PORT': 6379, 'DB': 0},
    'default': {'HOST': 'redis-default', 'PORT': 6379, 'DB': 0},
    'low': {'HOST': 'redis-low', 'PORT': 6379, 'DB': 0},
}
```
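Each `RQ_QUEUES` entry describes a Redis connection. For tools that expect a single connection URL, an entry can be collapsed like this (a hypothetical helper, not part of django-rq):

```python
def queue_redis_url(conf):
    # Build a redis:// URL from an RQ_QUEUES-style entry; defaults
    # mirror common Redis settings (localhost:6379, db 0).
    host = conf.get('HOST', 'localhost')
    port = conf.get('PORT', 6379)
    db = conf.get('DB', 0)
    return f"redis://{host}:{port}/{db}"

url = queue_redis_url({'HOST': 'redis-high', 'PORT': 6379, 'DB': 0})
# → "redis://redis-high:6379/0"
```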
```python
RQ = {
    'WORKER_CLASS': 'myapp.workers.CustomWorker',
    'JOB_CLASS': 'myapp.jobs.CustomJob',
    'DEFAULT_RESULT_TTL': 500,
}
```

Commands integrate with Django's logging system:
```python
# settings.py
LOGGING = {
    'version': 1,
    'handlers': {
        'rq_console': {
            'level': 'DEBUG',
            'class': 'rq.logutils.ColorizingStreamHandler',
        },
    },
    'loggers': {
        'rq.worker': {
            'handlers': ['rq_console'],
            'level': 'DEBUG',
        },
    },
}
```

## Signal Handling

Commands handle system signals gracefully:
- SIGTERM: Graceful shutdown after the current job completes
- SIGINT: Immediate shutdown (Ctrl+C)
- SIGUSR1: Force job failure and continue
- SIGUSR2: Suspend/resume worker

Monitor worker and scheduler health:
```shell
# Check worker status
python manage.py rqstats --json | jq '.queues[].workers'

# Check scheduler status
python manage.py rqstats --json | jq '.schedulers'

# Monitor job processing rates
python manage.py rqstats --interval=5
```

## Troubleshooting

Common issues and solutions:
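One frequent cause of jobs sitting unprocessed is starting workers without every configured queue. A quick plain-Python check (a sketch; feed it your `RQ_QUEUES` keys and the queue names your workers were started with):

```python
def missing_queues(configured, listened):
    # Queues defined in settings that no worker listens to are where
    # enqueued jobs silently pile up.
    return sorted(set(configured) - set(listened))

gaps = missing_queues(["high", "default", "low"], ["default"])  # → ["high", "low"]
```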
## Installation

Install with the Tessl CLI:

```shell
npx tessl i tessl/pypi-django-rq
```