Data Systems Engineer — Sydney, Australia
Building data infrastructure that handles real workloads. Always has.
I design and build data systems that scale — from trillion-row data warehouses to ML pipelines processing millions of images daily. My background spans regulated industries (healthcare, finance) and companies of all sizes, from global enterprises to fast-growing startups.
Core strengths:
- Data engineering at scale (Spark, Kafka, Snowflake, Cassandra)
- ML pipelines (computer vision, embeddings, similarity search)
- Distributed systems (Java/Scala, Akka, Apache Ignite)
- Cloud architecture (AWS, Kubernetes, Docker)
- API design (FHIR, REST, GraphQL)
Selected projects:
Embedded graph-vector database. Combines knowledge graphs with vector search — runs 100% locally, no cloud, no subscriptions.
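The graph-plus-vector combination can be sketched in a few lines of Python. This is a toy illustration, not the engine's actual design: `GraphVectorStore`, its brute-force cosine search, and the one-hop neighbour expansion are hypothetical stand-ins (a real embedded engine would use an ANN index such as HNSW and persistent storage).

```python
import math
from collections import defaultdict

class GraphVectorStore:
    """Toy in-memory hybrid of a knowledge graph and a vector index."""

    def __init__(self):
        self.vectors = {}              # node id -> embedding
        self.edges = defaultdict(set)  # node id -> neighbours

    def add_node(self, node_id, embedding):
        self.vectors[node_id] = embedding

    def add_edge(self, a, b):
        self.edges[a].add(b)
        self.edges[b].add(a)

    @staticmethod
    def _cosine(u, v):
        dot = sum(x * y for x, y in zip(u, v))
        nu = math.sqrt(sum(x * x for x in u))
        nv = math.sqrt(sum(x * x for x in v))
        return dot / (nu * nv) if nu and nv else 0.0

    def search(self, query, k=3, expand_hops=False):
        """Return the k most similar nodes; optionally widen the result
        with their one-hop graph neighbours."""
        ranked = sorted(self.vectors,
                        key=lambda n: self._cosine(query, self.vectors[n]),
                        reverse=True)[:k]
        if expand_hops:
            seen = set(ranked)
            for n in list(ranked):
                for nb in self.edges[n]:
                    if nb not in seen:
                        ranked.append(nb)
                        seen.add(nb)
        return ranked

store = GraphVectorStore()
store.add_node("python", [1.0, 0.0])
store.add_node("scala", [0.9, 0.1])
store.add_node("spark", [0.5, 0.5])
store.add_edge("scala", "spark")
result = store.search([1.0, 0.0], k=1, expand_hops=True)  # -> ["python"]
```

The point of the hybrid: vector search finds what is semantically close, while graph edges pull in related entities that plain similarity would miss.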
Australian property analytics engine built with Tauri/Rust. Market data, rental insights, investment metrics.
Stock market scanner using Qullamaggie methodology. 20-point scoring system with live Yahoo Finance data.
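A point-based scanner of this kind can be sketched as a list of scored rules. The criteria and thresholds below are hypothetical placeholders, not the scanner's actual 20-point checklist:

```python
def score_ticker(metrics):
    """Score a ticker against (name, predicate, points) rules.
    The rules here are illustrative momentum checks only."""
    rules = [
        ("above_50dma",   lambda m: m["close"] > m["ma50"], 5),
        ("above_200dma",  lambda m: m["close"] > m["ma200"], 5),
        ("adr_over_4pct", lambda m: m["adr_pct"] >= 4.0, 5),
        ("volume_surge",  lambda m: m["volume"] >= 1.5 * m["avg_volume"], 5),
    ]
    return sum(points for _, check, points in rules if check(metrics))

sample = {"close": 120.0, "ma50": 110.0, "ma200": 100.0,
          "adr_pct": 5.2, "volume": 2_000_000, "avg_volume": 1_000_000}
print(score_ticker(sample))  # 20: all four rules pass
```

Keeping each criterion as an independent (predicate, points) pair makes a 20-point system easy to extend and to explain: the score decomposes into exactly which checks passed.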
AI study companion for educational videos — transcripts, visual summaries, smart navigation. React + Python pipeline.
Production-ready FHIR R4 client for healthcare interoperability. Full HL7 compliance.
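FHIR R4's RESTful search interface addresses resources as `[base]/[type]?[params]`. A minimal URL builder, shown with a hypothetical base endpoint, might look like:

```python
from urllib.parse import urlencode

def fhir_search_url(base, resource_type, **params):
    """Build a FHIR R4 RESTful search URL, e.g. [base]/Patient?name=Smith.
    Parameters are sorted for a deterministic query string."""
    query = urlencode(sorted(params.items()))
    return f"{base.rstrip('/')}/{resource_type}" + (f"?{query}" if query else "")

# birthdate uses a FHIR search prefix (ge = greater-or-equal)
url = fhir_search_url("https://fhir.example.org/r4", "Patient",
                      name="Smith", birthdate="ge1990-01-01")
```

A full client layers bundle paging, `_include` handling, and resource validation on top of this, but every interaction starts from URLs shaped like these.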
Active contributor to an open-source distributed caching platform.
Tech stack:
Languages: Python, Scala, Java, TypeScript
Data: Spark, Kafka, Snowflake, Cassandra, PostgreSQL, Elasticsearch
ML: PyTorch, ONNX, HNSW, scikit-learn
Cloud: AWS, Kubernetes, Docker
Frameworks: Spring Boot, Akka, React, Tauri
Earlier experience:
- MSCI/IPD: Built a real estate calculation engine, achieving 3x throughput with custom off-heap storage
- BNP Paribas: Global settlement systems (Tokyo, Hong Kong)
- Tata Consultancy Services (TCS): Core banking systems for State Bank of India (10,000+ branches)
I build systems that work — in regulated industries, at scale, across the stack.