Lead Data Platform Engineer

Supermetrics

πŸ“Remote - Finland

Summary

Join our Data Platform team as a Lead Data Platform Engineer and play a key role in building and maintaining our advanced internal data platform. You will lead technical efforts, guide technical decisions, and mentor another senior engineer. Your primary focus will be on developing, maintaining, and enhancing the platform's capabilities to ensure scalability, reliability, and efficiency. You will work with public cloud providers, Kubernetes, and various data processing tools, and collaborate closely with the Head of Data to meet the company's data needs. This is a senior technical role requiring extensive experience in platform engineering and data infrastructure.

Requirements

  • Extensive experience in platform engineering, data engineering, or software engineering with a focus on data infrastructure
  • Extensive hands-on experience with Cloud Platforms
  • Proficiency in Python programming
  • Solid experience developing, deploying, and managing data platforms, applications, and big data pipelines on Kubernetes
  • Experience with workflow orchestration tools, specifically Airflow (Cloud Composer)
  • Experience with CI/CD principles and tools, particularly ArgoCD
  • Experience in domains such as event tracking, data integration, and building/maintaining data platforms
  • Ability to work independently, take ownership of technical areas, and move projects to completion
  • Advanced problem-solving and technical leadership skills
  • Effective communication skills and ability to mentor other engineers

Responsibilities

  • Be responsible for the technical design, development, and maintenance of core data platform components and services
  • Execute technical decision-making for the evolution of the data platform architecture and technology stack
  • Engage in hands-on development using Python, Kubernetes, and other relevant technologies to build and improve platform functionalities
  • Maintain and enhance our event tracking infrastructure based on Snowplow/OpenSnowcat, custom data loaders, and associated tooling
  • Manage and operate data integration pipelines and tools like Cloud Data Fusion and Cloud Composer (Airflow)
  • Implement and manage deployment strategies using Kubernetes, ArgoCD, and GitOps principles
  • Maintain and improve data platform tooling, including dbt, Looker technical setup, and custom internal tools
  • Act as a senior technical mentor to another experienced engineer on the team
  • Collaborate with development teams in a consultative role regarding data collection and integration best practices
  • Ensure the reliability, scalability, observability, and usability of the data platform

Preferred Qualifications

  • Experience designing or working with event-driven architectures
  • Familiarity with the concept and application of data contracts
  • Experience with Snowplow or similar event-tracking technologies
  • Experience with Infrastructure as Code tools like Terraform
  • Familiarity with dbt or Looker
  • Experience developing services or applications in Go or Rust

Benefits

  • Competitive compensation package, including equity and bonus
  • Excellent work equipment and home office allowance for those working in our fully remote locations
  • Health care benefits and leisure time insurance
  • Personal learning budget
  • Sports and wellbeing allowance
