Author: Gerardo Benitez
Date: August 2025
Version: 1.0.0
High-performance microservice for integrating external provider plans into the Fever marketplace with master-slave PostgreSQL replication, Redis caching, and async task processing.
- Ingest Service fetches XML from External Provider API
- Parses XML and publishes events to Celery via RabbitMQ
- Celery Workers process events and upsert to PostgreSQL Master
- PostgreSQL Master replicates data to Slave (streaming replication)
- FastAPI reads from the Slave (for performance) and writes to the Master
- Redis caches frequent queries for sub-second response times
- Clients consume REST API for plan searches
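As an illustration of the parse step above, here is a minimal stdlib sketch of turning provider XML into flat event dicts ready for upserting. The tag and attribute names are assumptions based on a typical provider payload, not a confirmed schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical provider payload; real tag/attribute names may differ.
SAMPLE_XML = """
<eventList>
  <base_event base_event_id="291" title="Camela en concierto" sell_mode="online">
    <event event_start_date="2021-06-30T21:00:00" event_end_date="2021-06-30T22:00:00">
      <zone price="20.00"/>
      <zone price="15.00"/>
    </event>
  </base_event>
</eventList>
"""

def parse_events(xml_text: str) -> list[dict]:
    """Parse provider XML into flat event dicts ready for upserting."""
    events = []
    root = ET.fromstring(xml_text)
    for base in root.iter("base_event"):
        for ev in base.iter("event"):
            prices = [float(z.get("price")) for z in ev.iter("zone")]
            events.append({
                "id": base.get("base_event_id"),
                "title": base.get("title"),
                "starts_at": ev.get("event_start_date"),
                "ends_at": ev.get("event_end_date"),
                "min_price": min(prices) if prices else None,
                "max_price": max(prices) if prices else None,
            })
    return events
```

Each resulting dict maps directly onto the fields the `/search` endpoint later returns (title, dates, min/max price).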
- Read/Write Separation: Master-Slave PostgreSQL setup
- Caching Layer: Redis for high-performance queries
- Async Processing: Celery + RabbitMQ for background tasks
- Horizontal Scaling: Stateless services ready for load balancing
- Fault Tolerance: Service isolation and graceful degradation
- Performance: Sub-second API responses via caching strategy
FastAPI App - main REST API (port 8000)
PostgreSQL Master - primary database for writes (port 5432)
PostgreSQL Slave - read-only replica (port 5433)
Redis - cache for frequent queries (port 6379)
Ingest Service - periodic synchronization with the external API
Celery Worker - asynchronous task processing
RabbitMQ - message broker for Celery (port 5672)
Read: Client → FastAPI → Redis Cache → PostgreSQL Slave
Write: Ingest → RabbitMQ → Celery → PostgreSQL Master → Replication → Slave
Cache: Frequent queries stored in Redis with TTL
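The read path above is a classic cache-aside pattern: check Redis first, fall back to the slave on a miss, then populate the cache with a TTL. A stdlib-only sketch of that logic (a dict with expiry timestamps stands in for Redis, and `query_slave` is a hypothetical placeholder for the replica query):

```python
import time

CACHE: dict[str, tuple[float, object]] = {}  # key -> (expiry_timestamp, value)
TTL_SECONDS = 300                            # 5-minute TTL, as used for search results

def query_slave(key: str) -> list[dict]:
    """Placeholder for a real query against the PostgreSQL slave."""
    return [{"id": "evt-1", "title": "Example plan"}]

def cached_search(key: str) -> list[dict]:
    """Cache-aside read: hit the cache first, fall back to the read replica."""
    entry = CACHE.get(key)
    if entry is not None and entry[0] > time.monotonic():
        return entry[1]                      # cache hit
    value = query_slave(key)                 # cache miss: read from the slave
    CACHE[key] = (time.monotonic() + TTL_SECONDS, value)
    return value
```

In the real service the dict would be replaced by Redis `GET`/`SET` calls with an expiry, but the control flow is the same.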
- Docker & Docker Compose
- Python 3.10+
- Git
# Clone repository
git clone <repository-url>
cd fever
# Install dependencies
make install
# Init the ingest service
make ingest
# Run the application
make run

# Start all services
docker-compose up
# Start specific services
docker-compose up postgres-master postgres-slave redis
docker-compose up api ingest celery

GET /search?starts_at=2023-01-01T00:00:00&ends_at=2023-12-31T23:59:59

Parameters:
- starts_at (datetime): start date filter (ISO format)
- ends_at (datetime): end date filter (ISO format)
Response:
{
"data": {
"events": [
{
"id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
"title": "string",
"start_date": "2025-08-26",
"start_time": "22:38:19",
"end_date": "2025-08-26",
"end_time": "14:45:15",
"min_price": 0,
"max_price": 0
}
]
},
"error": null
}

GET /health

Response:

{"status": "healthy"}

- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
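The starts_at/ends_at query parameters are ISO-format datetimes; a small stdlib sketch of the validation the endpoint needs (the function name is illustrative, not from the codebase):

```python
from datetime import datetime

def parse_range(starts_at: str, ends_at: str) -> tuple[datetime, datetime]:
    """Validate the ISO-format time-range parameters of /search."""
    start = datetime.fromisoformat(starts_at)
    end = datetime.fromisoformat(ends_at)
    if end < start:
        raise ValueError("ends_at must not precede starts_at")
    return start, end
```

In the actual app, FastAPI performs this parsing automatically when the parameters are declared with `datetime` type hints.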
DATABASE_WRITE_URL=postgresql+asyncpg://user:password@postgres-master:5432/fever_db
DATABASE_READ_URL=postgresql+asyncpg://user:password@postgres-slave:5432/fever_db
REDIS_URL=redis://redis:6379/0
EXTERNAL_API_URL=https://provider.code-challenge.feverup.com/api/events
SYNC_INTERVAL_SECONDS=300
API_TIMEOUT_SECONDS=30

Create a .env file in the project root with the above variables adjusted for your local setup.
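A minimal stdlib sketch of loading these variables (the real service might use pydantic settings instead; the defaults here simply mirror the sample values above):

```python
import os

def load_settings() -> dict:
    """Read service configuration from the environment, with sample defaults."""
    return {
        "database_write_url": os.getenv(
            "DATABASE_WRITE_URL",
            "postgresql+asyncpg://user:password@postgres-master:5432/fever_db"),
        "database_read_url": os.getenv(
            "DATABASE_READ_URL",
            "postgresql+asyncpg://user:password@postgres-slave:5432/fever_db"),
        "redis_url": os.getenv("REDIS_URL", "redis://redis:6379/0"),
        "sync_interval_seconds": int(os.getenv("SYNC_INTERVAL_SECONDS", "300")),
        "api_timeout_seconds": int(os.getenv("API_TIMEOUT_SECONDS", "30")),
    }
```

Keeping separate read and write URLs is what lets the app route queries to the slave and writes to the master.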
# Run all tests
pytest
# Run specific test file
pytest tests/test_celery_worker.py -v
# Run with coverage
pytest --cov=app tests/

# Test API endpoints
curl "http://localhost:8000/search?starts_at=2023-01-01T00:00:00&ends_at=2023-12-31T23:59:59"
# Test health endpoint
curl http://localhost:8000/health

- API Health: GET /health
- Database: built-in healthchecks in docker-compose
- Redis: Connection monitoring
- Celery: Worker status monitoring
- Application Logs: structured JSON logging; view with:
docker-compose logs -f api
- Celery Logs: task execution and error tracking; view with:
docker-compose logs -f celery
- RabbitMQ Logs: view with:
docker-compose logs -f rabbit
- Database Logs: Query performance monitoring
- Response time monitoring
- Cache hit/miss ratios
- Database connection pool status
- Celery task queue length
- Indexes: Optimized for time-range queries
- Connection Pooling: Async connection management
- Read Replicas: Separate read/write operations
- Redis TTL: 5-minute cache for search results
- Cache Keys: Based on query parameters
- Cache Invalidation: Automatic on data updates
- Async Operations: Non-blocking I/O throughout
- Response Compression: Gzip compression enabled
- Connection Keep-Alive: HTTP/1.1 persistent connections
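Cache keys based on query parameters can be built deterministically by sorting the parameters before hashing, so that equivalent queries in different order hit the same Redis entry. A sketch (the `search:` key prefix is an assumption):

```python
import hashlib
from urllib.parse import urlencode

def search_cache_key(params: dict) -> str:
    """Build a deterministic cache key from /search query parameters."""
    canonical = urlencode(sorted(params.items()))  # order-independent encoding
    digest = hashlib.sha256(canonical.encode()).hexdigest()[:16]
    return f"search:{digest}"
```

Hashing keeps keys short and uniform regardless of how long the datetime parameters are, and the shared prefix makes bulk invalidation on data updates straightforward.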
- Use environment-specific credentials
- Enable SSL/TLS for all connections
- Implement API rate limiting
- Set up proper firewall rules
# Scale API instances
docker-compose up --scale api=3
# Scale Celery workers
docker-compose up --scale celery=5

# Backup master database
docker exec fever_postgres-master_1 pg_dump -U user fever_db > backup.sql
# Restore from backup
docker exec -i fever_postgres-master_1 psql -U user fever_db < backup.sql

app/
├── core/                  # Celery configuration
├── services/              # Business logic
│   ├── external_api.py    # External API client
│   ├── plans_services.py  # Plans retrieval service
│   └── sync_services.py   # Background sync launcher
├── models.py              # Database models
├── schemas.py             # Pydantic schemas
├── database.py            # Database connections
├── cache.py               # Redis cache service
└── main.py                # FastAPI application
tests/
├── test_celery_worker.py
└── ...
- Fork the repository
- Create feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit pull request
- Follow PEP 8 style guidelines
- Add type hints
- Write comprehensive tests
- Document complex functions
Note: this architecture is designed for high availability, scalability, and optimal performance.

