Phase 1 implementation of the project-flow PoC:
POST /projects
- stores the project in Postgres
- returns 201 Created
The project is structured for the next phases with a service layer, a dedicated events package, and golang-migrate SQL migrations.
Event publishing is stubbed for now. Broker integration comes in the next phase.
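A minimal sketch of what the stubbed events package might expose; the type and method names here are assumptions for illustration, not the actual API:

```go
package main

import "fmt"

// Event carries the payload published when a project is created.
type Event struct {
	Type    string
	Payload []byte
}

// Publisher abstracts the broker so the service layer does not
// depend on RabbitMQ directly.
type Publisher interface {
	Publish(e Event) error
}

// stubPublisher satisfies Publisher without talking to a broker;
// a RabbitMQ-backed implementation would replace it in the next phase.
type stubPublisher struct{}

func (stubPublisher) Publish(e Event) error {
	fmt.Printf("stub: would publish %s\n", e.Type)
	return nil
}

func main() {
	var p Publisher = stubPublisher{}
	_ = p.Publish(Event{Type: "project.created"})
}
```

Keeping the broker behind an interface means the swap to RabbitMQ only touches the events package, not the service layer.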
The service currently follows a simple PoC flow: it inserts the project first and only then attempts to publish the event.
That means a future publish failure would leave the project persisted in Postgres while the API still returns an error. For this PoC that tradeoff is acceptable, but it is not atomic across Postgres and RabbitMQ.
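The insert-then-publish flow and its failure mode can be sketched as follows; all names here are illustrative assumptions, with an in-memory store and a fake publisher standing in for Postgres and RabbitMQ:

```go
package main

import (
	"errors"
	"fmt"
)

// store stands in for the Postgres-backed repository.
type store struct{ nextID int64 }

func (s *store) insert(name string) (int64, error) {
	s.nextID++
	return s.nextID, nil
}

// publisher stands in for the broker; fail simulates an outage.
type publisher struct{ fail bool }

func (p publisher) publish(event string) error {
	if p.fail {
		return errors.New("broker unavailable")
	}
	return nil
}

// createProject mirrors the PoC flow: persist first, publish second.
// A publish failure leaves the row behind while the caller sees an error.
func createProject(s *store, p publisher, name string) (int64, error) {
	id, err := s.insert(name) // step 1: persist to Postgres
	if err != nil {
		return 0, err
	}
	// step 2: publish the event; failure here is not rolled back
	if err := p.publish("project.created"); err != nil {
		return id, fmt.Errorf("project %d saved but event not published: %w", id, err)
	}
	return id, nil
}

func main() {
	s := &store{}
	id, err := createProject(s, publisher{fail: true}, "project-flow")
	fmt.Println(id, err)
}
```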
The likely next production-oriented improvement is an outbox pattern so the database write and event recording happen in one transaction before asynchronous publishing.
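With an outbox, both writes share one transaction; a sketch under assumed table and column names (`projects`, `outbox` and the helper below are illustrative, not the actual schema):

```go
package main

import (
	"context"
	"database/sql"
	"encoding/json"
)

// outboxPayload builds the JSON stored in the outbox row.
func outboxPayload(id int64, name string) ([]byte, error) {
	return json.Marshal(map[string]any{"id": id, "name": name})
}

// createProjectWithOutbox writes the project row and the event row in
// one transaction, so either both are persisted or neither is. A
// separate worker later reads the outbox table and publishes to
// RabbitMQ, retrying until it succeeds.
func createProjectWithOutbox(ctx context.Context, db *sql.DB, name, repoURL, desc string) (int64, error) {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return 0, err
	}
	defer tx.Rollback() // no-op after a successful Commit

	var id int64
	err = tx.QueryRowContext(ctx,
		`INSERT INTO projects (name, repo_url, description)
		 VALUES ($1, $2, $3) RETURNING id`,
		name, repoURL, desc).Scan(&id)
	if err != nil {
		return 0, err
	}

	payload, err := outboxPayload(id, name)
	if err != nil {
		return 0, err
	}
	if _, err := tx.ExecContext(ctx,
		`INSERT INTO outbox (event_type, payload) VALUES ($1, $2)`,
		"project.created", payload); err != nil {
		return 0, err
	}
	return id, tx.Commit()
}

func main() {}
```

The API then only reports failure when the transaction fails; publishing becomes an at-least-once background concern.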
```
make migrate
docker compose up --build api
```

The API will be available at http://localhost:8080.
Migrations are run explicitly through golang-migrate and are not applied automatically when the API starts.
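Assuming the Makefile wraps the golang-migrate CLI, the `make migrate` target likely amounts to something like this (the exact invocation is an assumption):

```shell
# Apply all pending SQL migrations from ./migrations;
# DATABASE_URL is assumed to point at the Compose-managed Postgres.
migrate -path migrations -database "$DATABASE_URL" up
```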
```
curl -X POST http://localhost:8080/projects \
  -H "Content-Type: application/json" \
  -d '{
    "name": "project-flow",
    "repo_url": "https://github.com/st2f/project-flow",
    "description": "Event-driven project creation workflow using Go, RabbitMQ, and Postgres"
  }'
```

Example response:
```
{
  "id": 1,
  "name": "project-flow",
  "repo_url": "https://github.com/st2f/project-flow",
  "description": "Event-driven project creation workflow using Go, RabbitMQ, and Postgres",
  "created_at": "2026-04-05T18:00:00Z"
}
```

Run the tests with:

```
make test
```

Project layout:

```
project-flow/
├── cmd/api
├── internal/config
├── internal/db
├── internal/events
├── internal/project
├── migrations
├── docker-compose.yml
├── Dockerfile
├── go.mod
└── Makefile
```
Note: configuration is provided through environment variables (supplied by Docker Compose in the local setup).
`.env.example` documents the expected variables but is not loaded automatically.