# Lambda Python Layer Builder — Infrastructure

Serverless infrastructure that builds AWS Lambda Python layers on-demand using EC2 Spot instances and Docker, with a GitHub Pages frontend.

## Architecture

```
┌──────────────────────────────────────────────────────────────────────┐
│                    GitHub Pages (docs/index.html)                    │
│ ┌────────────────────────────────────────────────────────────┐       │
│ │ requirements.txt │ Python version │ Architecture │ Submit  │       │
│ └──────────────────────────┬─────────────────────────────────┘       │
└────────────────────────────┼─────────────────────────────────────────┘
                             │ POST /builds
                             ▼
┌──────────────────────────────────────────────────────────────────────┐
│                        API Gateway (HTTP API)                        │
│  POST /builds      → submit_build Lambda                             │
│  GET  /builds/{id} → check_status Lambda                             │
└─────────┬────────────────────────────────────────┬───────────────────┘
          │                                        │
          ▼                                        ▼
┌───────────────────┐                   ┌───────────────────────┐
│  submit_build λ   │                   │    check_status λ     │
│ • Validates input │                   │ • Reads DynamoDB      │
│ • Creates record  │                   │ • Generates presigned │
│ • Sends to SQS    │                   │   S3 download URLs    │
└─────────┬─────────┘                   └───────────┬───────────┘
          │                                         │
          ▼                                         ▼
┌───────────────────┐                   ┌───────────────────────┐
│  SQS Build Queue  │                   │       DynamoDB        │
│    (with DLQ)     │                   │   buildId | status    │
└─────────┬─────────┘                   │   s3_keys | TTL       │
          │                             └───────────────────────┘
          ▼                                        ▲
┌───────────────────┐                              │
│  process_build λ  │                              │
│ • Launches EC2    │                              │
│   Spot instance   │                              │
└─────────┬─────────┘                              │
          │                                        │
          ▼                                        │
┌──────────────────────────────────────────────────┼───────────────────┐
│                EC2 Spot Instance                 │                   │
│ ┌─────────────────────────────────┐              │                   │
│ │ 1. Install Docker               │              │                   │
│ │ 2. Pull/build Docker image      │              │                   │
│ │ 3. Run container to build       │              │                   │
│ │    Lambda layer zip files       │              │                   │
│ │ 4. Upload zips to S3 ───────────┼──┐           │                   │
│ │ 5. Update DynamoDB status ──────┼──┼───────────┘                   │
│ │ 6. Self-terminate               │  │                               │
│ └─────────────────────────────────┘  │                               │
└──────────────────────────────────────┼───────────────────────────────┘
                                       │
                                       ▼
                             ┌───────────────────┐
                             │   S3 Artifacts    │
                             │ builds/{id}/*.zip │
                             │  Lifecycle: 24h   │
                             └───────────────────┘
```

## Flow

1. **User** opens GitHub Pages, enters `requirements.txt`, selects Python version & architecture
2. **API Gateway** routes `POST /builds` to the `submit_build` Lambda
3. **submit_build** validates input, creates a DynamoDB record (status `QUEUED`), and sends an SQS message
4. **SQS** triggers the `process_build` Lambda
5. **process_build** launches an EC2 Spot instance with a user-data script
6. **EC2 instance** installs Docker, pulls pre-built images from GHCR (or builds from the Dockerfile), runs the build, uploads zips to S3, updates DynamoDB (status `COMPLETED`), and self-terminates
7. **Frontend** polls `GET /builds/{id}`, which returns the status plus presigned S3 download URLs
8. **Artifacts** auto-expire from S3 after a configurable TTL (default 24h)
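
From the client's perspective, the flow above reduces to one POST and a polling loop. A minimal sketch of the polling half (the `BUILDING` intermediate status, poll interval, and timeout here are assumptions, not documented API behavior — the API reference below only shows `QUEUED` and `COMPLETED`):

```python
import time

def wait_for_build(build_id, fetch_status, poll_seconds=5, timeout_seconds=1800):
    """Poll GET /builds/{id} until the build leaves its in-progress states.

    fetch_status is any callable mapping a build ID to the parsed JSON
    status document, so this loop can be driven by urllib, requests, or
    a stub in tests. QUEUED comes from the API reference; BUILDING is an
    assumed intermediate status.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        doc = fetch_status(build_id)
        if doc["status"] not in ("QUEUED", "BUILDING"):
            return doc  # COMPLETED, or a terminal failure state
        time.sleep(poll_seconds)
    raise TimeoutError(f"build {build_id} did not finish in {timeout_seconds}s")
```

The GitHub Pages frontend does the same thing in JavaScript; pointing `fetch_status` at `GET {api_url}/builds/{id}` reproduces it from any script.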

## Cost Estimate

| Component | Cost | Notes |
|-----------|------|-------|
| EC2 Spot (c5.xlarge) | ~$0.04/hr | ~$0.01 per build (15 min avg) |
| S3 | ~$0.023/GB/month | Artifacts auto-expire |
| Lambda | ~$0.20/1M requests | Minimal usage |
| API Gateway | $1.00/1M requests | HTTP API pricing |
| DynamoDB | Pay-per-request | ~$0.00 for low volume |
| SQS | $0.40/1M messages | Negligible |
| **Total (idle)** | **~$0/month** | No running infrastructure |
| **Per build** | **~$0.01-0.03** | Spot instance + S3 |
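
The per-build figure is just the Spot rate prorated over the average build time (both numbers are the estimates from the table; actual Spot prices vary by region and moment):

```python
spot_rate_per_hour = 0.04   # ~c5.xlarge Spot estimate from the table above
avg_build_minutes = 15      # average build duration from the table above

# Prorate the hourly rate over the fraction of an hour a build runs.
cost_per_build = spot_rate_per_hour * avg_build_minutes / 60
print(f"${cost_per_build:.2f} per build")
```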

## Prerequisites

- AWS account with permissions to create VPC, EC2, Lambda, S3, SQS, DynamoDB, API Gateway, IAM
- [Terraform](https://www.terraform.io/downloads) >= 1.5.0
- AWS CLI configured (`aws configure`)

## Deployment

```bash
cd infrastructure/terraform

# Copy and customize configuration
cp terraform.tfvars.example terraform.tfvars
# Edit terraform.tfvars with your preferences

# Initialize and deploy
terraform init
terraform plan
terraform apply
```

After deployment, note the `api_url` output:

```
Outputs:
  api_url = "https://xxxxxxxxxx.execute-api.eu-central-1.amazonaws.com"
```

### Configure GitHub Pages

1. In your GitHub repository: **Settings → Pages → Source: Deploy from a branch**
2. Select **Branch: main**, **Folder: /docs**
3. Open your GitHub Pages URL
4. Click **⚙ API Settings** and paste the `api_url` from the Terraform output
5. Start building layers!

## Configuration

| Variable | Default | Description |
|----------|---------|-------------|
| `aws_region` | `eu-central-1` | AWS region |
| `environment` | `prod` | Environment name |
| `artifact_ttl_hours` | `24` | Hours to keep artifacts in S3 |
| `ec2_instance_type` | `c5.xlarge` | Spot instance type |
| `ec2_volume_size` | `50` | EBS volume size (GB) |
| `ec2_max_build_time_minutes` | `30` | Safety timeout per build |
| `allowed_origins` | `["*"]` | CORS origins |
| `docker_image_prefix` | `ghcr.io/fok666/lambda-python-layer` | Pre-built image registry |
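
These variables map one-to-one onto `terraform.tfvars`. An illustrative sketch (the values are examples, not recommendations; narrowing `allowed_origins` to your own Pages origin is tighter than the default `["*"]`, and the origin shown is a placeholder):

```hcl
aws_region                 = "eu-central-1"
environment                = "prod"
artifact_ttl_hours         = 24
ec2_instance_type          = "c5.xlarge"
ec2_volume_size            = 50
ec2_max_build_time_minutes = 30
allowed_origins            = ["https://your-user.github.io"]
docker_image_prefix        = "ghcr.io/fok666/lambda-python-layer"
```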

## API Reference

### POST /builds

Submit a new build request.

```json
{
  "requirements": "numpy==1.26.4\nrequests==2.32.4",
  "python_version": "3.13",
  "architectures": ["x86_64", "arm64"],
  "single_file": true
}
```

**Response:**
```json
{
  "build_id": "a1b2c3d4-...",
  "status": "QUEUED",
  "expires_at": 1709398800
}
```
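
The request body can be assembled straight from a local `requirements.txt`. A sketch (the helper name and its defaults are mine; the field names and example values come from the request shown above):

```python
import json

def build_request(requirements_path, python_version="3.13",
                  architectures=("x86_64", "arm64"), single_file=True):
    """Serialize a POST /builds body from a local requirements file.

    The API takes the file contents as one newline-joined string in the
    "requirements" field.
    """
    with open(requirements_path) as f:
        requirements = f.read().strip()
    if not requirements:
        raise ValueError("requirements.txt is empty")
    return json.dumps({
        "requirements": requirements,
        "python_version": python_version,
        "architectures": list(architectures),
        "single_file": single_file,
    })
```

The returned string is ready to send as the body of `POST {api_url}/builds` with a `Content-Type: application/json` header.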

### GET /builds/{buildId}

Check build status. Returns presigned download URLs when completed.

**Response (completed):**
```json
{
  "build_id": "a1b2c3d4-...",
  "status": "COMPLETED",
  "python_version": "3.13",
  "architectures": ["x86_64", "arm64"],
  "files": [
    {
      "filename": "combined-python3.13-x86_64.zip",
      "download_url": "https://s3.amazonaws.com/...",
      "architecture": "x86_64"
    },
    {
      "filename": "combined-python3.13-aarch64.zip",
      "download_url": "https://s3.amazonaws.com/...",
      "architecture": "arm64"
    }
  ]
}
```
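
A completed status document maps cleanly onto per-architecture downloads. A small helper (the function name and error handling are mine; the keys come from the response shown above):

```python
def downloads_by_architecture(status_doc):
    """Map architecture -> presigned URL from a COMPLETED status document.

    Raises if the build is not finished, since the "files" list is only
    present on a completed response.
    """
    if status_doc["status"] != "COMPLETED":
        raise ValueError(f"build is {status_doc['status']}, not COMPLETED")
    return {f["architecture"]: f["download_url"] for f in status_doc["files"]}
```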

## Security

- **S3 bucket**: Private, no public access. Downloads via presigned URLs only
- **EC2 instances**: No SSH, no inbound ports. Egress-only security group
- **IMDSv2**: Enforced on all EC2 instances
- **EBS encryption**: Enabled by default
- **IAM**: Least-privilege policies per component
- **DynamoDB TTL**: Automatic cleanup of old records
- **S3 lifecycle**: Automatic deletion of old artifacts

## Teardown

```bash
cd infrastructure/terraform
terraform destroy
```

> **Note:** The S3 bucket must be empty before destruction. Terraform will fail if artifacts exist. Wait for lifecycle expiration or manually empty the bucket.