Open Source · Personal Project
Apollo
Hardware · Cloud-native · AI-powered
From dual-MCU edge devices to Kubernetes-deployed microservices with full observability. End to end.
[view on github ↗]
// ai video analysis
Camera feeds are analyzed by Vision Language Models at the edge — not motion detection, but actual scene understanding. The system reasons about what it sees instead of just flagging pixel changes.
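As a rough sketch of what an edge-side analysis request could look like, assuming an OpenAI-style vision chat endpoint (the model name, field names, and function below are illustrative, not Apollo's actual API):

```typescript
// Sketch: wrap a base64-encoded camera frame in an OpenAI-style
// vision-chat payload. All names here are illustrative assumptions.
interface FrameRequest {
  model: string;
  messages: {
    role: "user";
    content: (
      | { type: "text"; text: string }
      | { type: "image_url"; image_url: { url: string } }
    )[];
  }[];
}

function buildFrameRequest(frameBase64: string, question: string): FrameRequest {
  return {
    model: "vlm-local", // hypothetical locally served VLM
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: question },
          // Frame travels inline as a data URL alongside the question.
          { type: "image_url", image_url: { url: `data:image/jpeg;base64,${frameBase64}` } },
        ],
      },
    ],
  };
}
```

Pairing the frame with a natural-language question is what turns this into scene understanding rather than pixel diffing: the same pipeline can answer "is anyone at the door?" or "is the stove on?" without new detector code.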
// architecture
[Dual-MCU Hardware] → MQTT → [API Gateway] → services
[React Native App] → [API Gateway] → services
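The MQTT hop boils down to a topic convention shared by devices and services. The scheme below is an assumption for illustration (Apollo's real topic naming may differ):

```typescript
// Hypothetical topic scheme: apollo/devices/<deviceId>/telemetry
function telemetryTopic(deviceId: string): string {
  return `apollo/devices/${deviceId}/telemetry`;
}

// Recover the deviceId from an incoming topic; null if it doesn't match
// the telemetry scheme (e.g. a topic from another channel).
function parseDeviceId(topic: string): string | null {
  const m = topic.match(/^apollo\/devices\/([^/]+)\/telemetry$/);
  return m ? m[1] : null;
}
```

Keeping the device identity in the topic (rather than only in the payload) lets the broker do the routing: a service can subscribe to `apollo/devices/+/telemetry` and fan messages out per device without inspecting every payload.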
api-gateway — Single entry point: routing, auth middleware, Swagger aggregation
device-service — Manages registered hardware devices and their state
media-analysis-service — Feeds camera frames to VLMs for scene understanding
home-service — Home configuration and zone management
user-service — Authentication, user profiles, JWT issuance
notification-service — Push notifications and alert delivery
file-storage-service — Media storage and retrieval via MinIO
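The auth flow hinges on the JWTs that user-service issues and the gateway verifies. A minimal HS256 signing sketch, for illustration only — the claim names and secret are made up, and a real service would use a vetted library (e.g. jjwt on the Spring Boot side) with proper key management:

```typescript
import { createHmac } from "node:crypto";

// Base64url-encode a string or buffer (JWT's encoding of choice).
function base64url(input: Buffer | string): string {
  return Buffer.from(input).toString("base64url");
}

// Minimal HS256 JWT: header.payload.signature, each segment base64url.
function signJwt(claims: Record<string, unknown>, secret: string): string {
  const header = base64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const payload = base64url(JSON.stringify(claims));
  const sig = createHmac("sha256", secret)
    .update(`${header}.${payload}`)
    .digest("base64url");
  return `${header}.${payload}.${sig}`;
}
```

Because the token is self-describing and signed, the gateway's auth middleware can validate it statelessly — no round trip to user-service on every request.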
// observability
metrics:   Prometheus + Grafana
logs:      Loki
traces:    Tempo
profiling: Pyroscope
deployed:  Kubernetes (K3s)
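On K3s, wiring services into Prometheus is typically annotation-driven. A sketch of the relevant scrape config, assuming the standard `prometheus.io/scrape` pod-annotation convention (not necessarily Apollo's exact setup):

```yaml
# prometheus.yml fragment — scrape any pod that opts in via annotation.
scrape_configs:
  - job_name: "apollo-services"
    kubernetes_sd_configs:
      - role: pod
    relabel_configs:
      - source_labels: [__meta_kubernetes_pod_annotation_prometheus_io_scrape]
        action: keep
        regex: "true"
```

With this in place, each Spring Boot service only needs to expose its metrics endpoint and carry the annotation — no central config change per new microservice.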
// testing infrastructure
- Hardware device simulator (React app)
- Locust load testing suite
- Swagger UI via API Gateway
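The simulator's job is to emit device-shaped traffic without real hardware. A sketch of what one simulated reading might look like — the field names and value ranges are invented for illustration, not the real device schema:

```typescript
// One fake telemetry sample from a simulated device.
interface SimulatedReading {
  deviceId: string;
  ts: number;          // epoch millis
  temperatureC: number;
  motion: boolean;
}

function simulateReading(deviceId: string, now: number = Date.now()): SimulatedReading {
  return {
    deviceId,
    ts: now,
    temperatureC: 20 + Math.random() * 5, // plausible 20–25 °C band
    motion: Math.random() < 0.1,          // ~10% chance of a motion event
  };
}
```

Generating readings as plain objects keeps the simulator decoupled from transport: the same samples can be published over MQTT for end-to-end tests or fed straight into Locust scenarios.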
[react native][spring boot][kubernetes][mqtt][minio][grafana][prometheus][loki][tempo][pyroscope][vlm][dual-mcu][java][expo]