
Lead Data Platform Engineer

Supermetrics
Summary
Join Supermetrics as a Lead Data Platform Engineer and shape the future of data at a fast-growing SaaS company. You will develop and maintain our internal data platform, which powers data ingestion, transformation, visualization, and data-driven decision-making across the company. As a lead engineer, you will be a senior technical voice, guiding architectural decisions and mentoring another senior engineer. Your primary focus will be developing, maintaining, and enhancing the platform's capabilities, ensuring scalability, reliability, and efficiency. You will work with public cloud providers, Kubernetes, and a range of data processing tools, collaborating closely with the Head of Data. Supermetrics offers a competitive compensation package, excellent work equipment, a flexible remote/hybrid policy, healthcare benefits, a personal learning budget, and more.
Requirements
- Extensive experience in platform engineering, data engineering, or software engineering with a focus on data infrastructure
- Extensive hands-on experience with Cloud Platforms
- Proficiency in Python programming
- Solid experience developing, deploying, and managing data platforms, applications, and big data pipelines on Kubernetes
- Experience with workflow orchestration tools, specifically Airflow (Cloud Composer)
- Experience with CI/CD principles and tools, particularly ArgoCD
- Experience in domains such as event tracking, data integration, and building/maintaining data platforms
- Ability to work independently, take ownership of technical areas, and move projects to completion
- Advanced problem-solving and technical leadership skills
- Effective communication skills and ability to mentor other engineers
Responsibilities
- Be responsible for the technical design, development, and maintenance of core data platform components and services
- Drive technical decisions on the evolution of the data platform architecture and technology stack
- Engage in hands-on development using Python, Kubernetes, and other relevant technologies to build and improve platform functionalities
- Maintain and enhance our event tracking infrastructure based on Snowplow/OpenSnowcat, custom data loaders, and associated tooling
- Manage and operate data integration pipelines and tools like Cloud Data Fusion and Cloud Composer (Airflow)
- Implement and manage deployment strategies using Kubernetes, ArgoCD, and GitOps principles
- Maintain and improve data platform tooling, including dbt, Looker technical setup, and custom internal tools
- Act as a senior technical mentor to another experienced engineer on the team
- Collaborate with development teams in a consultative role regarding data collection and integration best practices
- Ensure the reliability, scalability, observability, and usability of the data platform
Preferred Qualifications
- Experience designing or working with event-driven architectures
- Familiarity with the concept and application of data contracts
- Experience with Snowplow or similar event-tracking technologies
- Experience with Infrastructure as Code tools like Terraform
- Familiarity with dbt or Looker
- Experience developing services or applications in Go or Rust
Benefits
- Competitive compensation package, including equity and bonus
- Excellent work equipment and flexible remote/hybrid policy
- Health care benefits and leisure time insurance
- Personal learning budget
- Sports and wellbeing allowance