- **Redis Server** - Required for Celery message broker

```bash
# Install Redis (Ubuntu/Debian)
sudo apt install redis-server

# Install Redis (macOS)
brew install redis

# Start Redis
redis-server
```
- **Environment Setup**

```bash
# Copy environment template
cp .env.example .env

# Add Redis URL to .env
echo "REDIS_URL=redis://localhost:6379/0" >> .env
```
```bash
# Start the Django development server
python manage.py runserver

# Start a Celery worker
celery -A citis worker --loglevel=info

# Start the beat scheduler
celery -A citis beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler
```

```bash
# Archive worker (CPU intensive)
celery -A citis worker -Q archive --loglevel=info --concurrency=2

# Assets worker (I/O intensive)
celery -A citis worker -Q assets --loglevel=info --concurrency=4

# Analytics worker (lightweight)
celery -A citis worker -Q analytics --loglevel=info --concurrency=2
```

```bash
# Start general worker
python manage_celery.py worker

# Start archive-specific worker
python manage_celery.py worker --queue archive --concurrency 2

# Start beat scheduler
python manage_celery.py beat

# Monitor with Flower (install with: pip install flower)
python manage_celery.py flower
```

```bash
# Install Flower
pip install flower

# Start monitoring (available at http://localhost:5555)
celery -A citis flower
```

```bash
# List active tasks
celery -A citis inspect active

# Worker statistics
celery -A citis inspect stats

# Registered tasks
celery -A citis inspect registered
```

- User creates archive → `Shortcode` created immediately
- `archive_url_task` triggered asynchronously → archives the URL
- `extract_assets_task` triggered after 30s → screenshots, PDFs, favicons
- User gets an immediate response; archiving happens in the background
- User visits shortcode → `Visit` record created immediately
- `update_visit_analytics_task` triggered → GeoIP lookup
- Page loads immediately; analytics processed in the background
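Both flows follow the same fire-and-forget pattern. Below is a minimal, dependency-free sketch of that pattern — the task names mirror this document, but the bodies are placeholders, and the in-memory queue stands in for the Redis broker; the real project dispatches via Celery's `.delay()` / `.apply_async(countdown=30)`:

```python
from queue import Queue

broker = Queue()  # stand-in for the Redis-backed Celery broker

def enqueue(task_name, *args, countdown=0):
    """Record a task the way .apply_async() would, without blocking."""
    broker.put((task_name, args, countdown))

def create_archive(url):
    shortcode = "ABC123"                       # Shortcode created immediately
    enqueue("archive_url_task", shortcode)     # archiving happens in background
    enqueue("extract_assets_task", shortcode, countdown=30)  # assets ~30s later
    return shortcode                           # user gets an immediate response

def record_visit(shortcode):
    enqueue("update_visit_analytics_task", shortcode)  # GeoIP in background
    return shortcode                           # page loads immediately

create_archive("https://example.com")
record_visit("ABC123")

queued = []
while not broker.empty():
    queued.append(broker.get())
print([name for name, _, _ in queued])
# ['archive_url_task', 'extract_assets_task', 'update_visit_analytics_task']
```

The request handler never waits on the queued work, which is why the user sees an immediate response in both flows.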
```bash
# Check Redis connection
redis-cli ping

# Purge all tasks
celery -A citis purge

# Restart workers
pkill -f "celery worker"
python manage_celery.py worker
```

```bash
# Monitor failed tasks in Django admin
# Visit: http://localhost:8000/admin/django_celery_results/taskresult/

# Check logs
tail -f logs/citis.log
```

```bash
# Limit worker memory (restart after 100 tasks)
celery -A citis worker --max-tasks-per-child=100

# Monitor memory usage
celery -A citis events
```

Add to your `.env` file:
```bash
# Required for Celery
REDIS_URL=redis://localhost:6379/0

# Optional: Celery monitoring
CELERY_FLOWER_PASSWORD=your-flower-password
```

Configure health monitoring for different plan tiers:
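As a hedged sketch, these variables might be consumed in Django settings roughly like this (the setting names beyond `REDIS_URL` follow standard Celery / django-celery-results conventions; the project's actual settings may differ):

```python
import os

# Broker URL comes from the .env value above, with a local-dev fallback.
REDIS_URL = os.environ.get("REDIS_URL", "redis://localhost:6379/0")

CELERY_BROKER_URL = REDIS_URL        # Celery pulls tasks from Redis
CELERY_RESULT_BACKEND = "django-db"  # matches the django_celery_results admin pages
```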
```bash
# Setup health monitoring tasks (run once)
python manage.py setup_health_monitoring

# Reset and recreate all tasks if needed
python manage.py setup_health_monitoring --reset
```

This creates the following monitoring schedule:
- **Free Plan**: Daily link health checks (02:00 AM)
- **Professional Plan**: Link health every 5 minutes + content integrity hourly
- **Sovereign Plan**: Link health every minute + content integrity every 5 minutes
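The tiers above can be expressed as cron-style data like the following — an illustrative table only; the actual `setup_health_monitoring` command may store these differently via django_celery_beat:

```python
# Illustrative mapping of plan tier -> cron expression for each check.
# Each cron string encodes the interval listed above.
HEALTH_SCHEDULES = {
    "free": {
        "link_health": "0 2 * * *",          # daily at 02:00
    },
    "professional": {
        "link_health": "*/5 * * * *",        # every 5 minutes
        "content_integrity": "0 * * * *",    # hourly
    },
    "sovereign": {
        "link_health": "* * * * *",          # every minute
        "content_integrity": "*/5 * * * *",  # every 5 minutes
    },
}
```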
```bash
# Terminal 4: Start the periodic task scheduler
python manage_celery.py beat
```

Check the status of your monitoring tasks:
```bash
# View current periodic tasks
python manage.py setup_health_monitoring

# Check task execution in admin interface
# Visit: http://localhost:8000/admin/django_celery_results/taskresult/
```

You can also trigger health checks manually:
```bash
# Check specific shortcode health
python manage.py shell
>>> from archive.tasks import check_link_health_task
>>> check_link_health_task.delay('ABC123')

# Run content integrity scan
>>> from archive.tasks import content_integrity_scan_task
>>> content_integrity_scan_task.delay('ABC123')
```