Senior/Lead Data Engineer

Manila Recruitment

πŸ“Remote - Philippines

Summary

Join our client, an Australian data-driven trading business, as a Senior/Lead Data Engineer on their expanding Philippines team. You will design and operate a multi-cluster Apache Kafka system, build real-time stream-processing pipelines with Apache Flink, and persist results in fast analytical stores such as Redis. You will also create batch back-fill tooling, manage object storage, and establish observability and CI/CD infrastructure, collaborating closely with the founder on roadmap planning and hardware selection. The role calls for extensive experience in data engineering, Apache Kafka and real-time stream processing; success will see you help onboard the engineers who follow.

Requirements

  • 5+ years of Data Engineer experience
  • Autonomous ownership: plan, prioritise and deliver without hand-holding
  • Apache Kafka operations and performance tuning
  • 2+ years building real-time stream-processing pipelines in production
  • Clear, concise async communication
  • Calm under pressure / incident composure
  • Proficiency in Java or Python, plus solid SQL
  • Working understanding of distributed-systems basics (partitioning, watermarking, checkpointing, back-pressure)
  • Linux administration and containerisation (Docker/Podman)

Responsibilities

  • Design & run a multi-cluster Apache Kafka backbone that can grow beyond 200 Mbps sustained ingest (hedged sketches of this and the following responsibilities appear after this list)
  • Build stateful pipelines in Apache Flink (DataStream API & Flink SQL) for enrichment, cleansing, aggregation and feature generation
  • Persist processed streams into fast analytical stores (e.g. Redis) for low-latency look-ups and dashboards
  • Create batch back-fill tooling for historical re-processing when needed
  • Manage object storage (Amazon S3 / MinIO / Ceph) and define retention / lifecycle policies
  • Stand up observability (Prometheus, Grafana), CI/CD, and infrastructure-as-code with Terraform from scratch
  • Work day-to-day with the founder on roadmap, budget and hardware choices; if success follows, help onboard the next engineers
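
To make the stack concrete, a few hedged sketches follow. First, topic provisioning with confluent-kafka's AdminClient; the broker address, topic name and sizing are illustrative assumptions, not details from the role. At 200 Mbps (roughly 25 MB/s) sustained, 24 partitions keep each partition near 1 MB/s with headroom to grow.

```python
# Hedged sketch: provisioning an ingest topic with confluent-kafka's
# AdminClient. Broker address, topic name and sizing are assumptions.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker-1:9092"})

# ~200 Mbps is ~25 MB/s; 24 partitions keep each near 1 MB/s with headroom,
# and replication factor 3 gives durability across brokers.
topic = NewTopic(
    "market.events.raw",
    num_partitions=24,
    replication_factor=3,
    config={"retention.ms": str(7 * 24 * 3600 * 1000)},  # 7-day retention
)
for name, future in admin.create_topics([topic]).items():
    future.result()  # raises if creation failed
    print(f"created {name}")
```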
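
A minimal PyFlink sketch of the kind of stateful pipeline described above, assuming a JSON Kafka source and a one-minute tumbling-window aggregate as a stand-in for feature generation; topic, fields and broker address are assumptions, and the flink-sql-connector-kafka jar must be on the classpath.

```python
# Minimal PyFlink sketch: aggregate a Kafka stream with Flink SQL.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Source table with event-time watermarks so late data is handled explicitly.
t_env.execute_sql("""
    CREATE TABLE raw_events (
        event_id STRING,
        price    DOUBLE,
        ts       TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'market.events.raw',
        'properties.bootstrap.servers' = 'broker-1:9092',
        'properties.group.id' = 'flink-enrichment',
        'format' = 'json',
        'scan.startup.mode' = 'latest-offset'
    )
""")

# One-minute tumbling-window aggregate as a stand-in for feature generation.
t_env.execute_sql("""
    SELECT window_start, window_end,
           COUNT(*)   AS events,
           AVG(price) AS avg_price
    FROM TABLE(TUMBLE(TABLE raw_events, DESCRIPTOR(ts), INTERVAL '1' MINUTE))
    GROUP BY window_start, window_end
""").print()
```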
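
For the low-latency look-up side, a sketch of persisting computed features into Redis hashes with redis-py; the host, key layout and TTL are assumptions.

```python
# Sketch: persisting computed features into Redis hashes for low-latency
# look-ups. Host, key layout and TTL are assumptions.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def persist_features(event_id: str, features: dict) -> None:
    key = f"features:{event_id}"
    pipe = r.pipeline()  # batch both commands into one round-trip
    pipe.hset(key, mapping=features)
    pipe.expire(key, 24 * 3600)  # drop stale entries after a day
    pipe.execute()

persist_features("evt-123", {"avg_price": 101.5, "events": 42})
print(r.hgetall("features:evt-123"))
```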
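
A sketch of one shape back-fill tooling can take: seek every partition of a topic to the offset at a chosen timestamp and replay from there, without touching the live consumer group's offsets. The broker, topic, group id, timestamp and the reprocess() hook are all hypothetical.

```python
# Sketch of a back-fill tool: replay a Kafka topic from a chosen timestamp.
from confluent_kafka import Consumer, TopicPartition

def reprocess(payload: bytes) -> None:
    ...  # hypothetical hook into the normal processing path

consumer = Consumer({
    "bootstrap.servers": "broker-1:9092",
    "group.id": "backfill-20240101",
    "enable.auto.commit": False,  # a back-fill must not move live offsets
})

start_ms = 1704067200000  # 2024-01-01T00:00:00Z, in milliseconds

# Ask the broker for the first offset at/after the timestamp per partition.
meta = consumer.list_topics("market.events.raw")
wanted = [TopicPartition("market.events.raw", p, start_ms)
          for p in meta.topics["market.events.raw"].partitions]
consumer.assign(consumer.offsets_for_times(wanted, timeout=10.0))

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        break  # nothing more buffered; simplified stop condition
    if msg.error() is None:
        reprocess(msg.value())
consumer.close()
```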
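
Retention and lifecycle policies can be codified with boto3, which works against Amazon S3 and S3-compatible stores (pass endpoint_url for MinIO or Ceph RGW); the bucket name, prefix and day counts here are assumptions.

```python
# Sketch: codifying a retention/lifecycle policy with boto3.
import boto3

s3 = boto3.client("s3")  # for MinIO/Ceph, pass endpoint_url and credentials

s3.put_bucket_lifecycle_configuration(
    Bucket="market-data-archive",
    LifecycleConfiguration={"Rules": [{
        "ID": "tier-then-expire",
        "Status": "Enabled",
        "Filter": {"Prefix": "raw/"},
        "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
        "Expiration": {"Days": 365},  # delete raw objects after a year
    }]},
)
```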
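
On the observability side, a sketch of exposing worker metrics for Prometheus to scrape with prometheus_client (Grafana then dashboards the same series); metric names, port and the simulated work are assumptions.

```python
# Sketch: exposing pipeline worker metrics for Prometheus to scrape.
import time
from prometheus_client import Counter, Histogram, start_http_server

EVENTS = Counter("pipeline_events_total", "Events processed")
LATENCY = Histogram("pipeline_process_seconds", "Per-event processing time")

start_http_server(8000)  # serves /metrics on :8000

while True:
    with LATENCY.time():  # observe per-event processing time
        time.sleep(0.01)  # placeholder for real work
    EVENTS.inc()
```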

Preferred Qualifications

  • Hands-on Apache Flink experience: deploying, scaling and securing clusters
  • Schema-driven serialisation (Avro / Protobuf) and schema-registry management (see the sketch after this list)
  • Robust web scraping techniques for structured and unstructured sports data
  • Experience with object storage platforms (Amazon S3, Ceph, or MinIO)
  • High-throughput key-value stores (Redis or RocksDB)
  • Familiarity with modern virtualisation platforms (Kubernetes, Proxmox, VMware)
  • Infrastructure-as-Code with Terraform (or similar)
  • Rapid self-learning & curiosity
  • Problem slicing & pragmatic decision making
  • Bias for automation & documentation
  • Apache Pulsar, Faust or equivalent streaming frameworks
  • DuckDB, ClickHouse or similar OLAP engines
  • Observability stack: Prometheus, Loki, Grafana, Tempo, OpenTelemetry
  • CI/CD with GitHub Actions, ArgoCD or Flux
  • Collaborative knowledge sharing
  • Domain exposure to sports data, trading or high-frequency market making
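
For the schema-registry item above, a sketch of Avro serialisation against Confluent's schema registry; the registry URL, topic and schema are assumptions, and fastavro must be installed.

```python
# Sketch: schema-driven serialisation with Confluent's schema registry.
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

schema_str = """
{
  "type": "record",
  "name": "MarketEvent",
  "fields": [
    {"name": "event_id", "type": "string"},
    {"name": "price", "type": "double"}
  ]
}
"""

registry = SchemaRegistryClient({"url": "http://localhost:8081"})
serialize = AvroSerializer(registry, schema_str)

payload = serialize(
    {"event_id": "evt-123", "price": 101.5},
    SerializationContext("market.events.raw", MessageField.VALUE),
)
print(len(payload), "bytes; schema id is embedded in the wire format")
```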

Benefits

  • Full-time, work-from-home offshore contractor engagement with our client’s Australian-headquartered company
  • 13th month pay guaranteed
  • 30 days of combined leave plus 1 birthday leave (from Day 1)
  • HMO Allowance: Php5,000
  • One-off ergonomic kit worth Php12,000 (Day 1)
  • Monthly wellness allowance of Php1,000
  • UPS or LTE fail-over stipend (choose one)
  • Work equipment will be provided
  • Observance of PH regular Holidays
  • Annual performance bonus of 10-20% of salary, based on KPIs (paid in late April, pro-rated)
  • All-expenses-paid trip to the Brisbane HQ at the end of the 2025-2026 NBA regular season (late April 2026)

