Senior Data Engineer Team Lead

Dev.Pro

πŸ“Remote - Brazil

Summary

Join Dev.Pro as a Senior Data Engineer, working remotely from Brazil, Argentina, or Colombia to modernize data pipelines for a major enterprise client. Lead the Data Engineering team in migrating legacy pipelines to a custom-built cloud environment, collaborating with architects, DevOps, QA, and product stakeholders and working with modern technologies such as GCP, Snowflake, and dbt. The role combines hands-on engineering with team leadership to ensure timely, successful project delivery. The company offers a remote-first work environment and a range of benefits.

Requirements

  • 5+ years in data engineering and data warehouse modeling
  • Strong proficiency in designing and building ETL for large data volumes and streaming solutions
  • Expert-level SQL skills and experience with Snowflake and Apache Iceberg tables
  • Hands-on experience with GCP services (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
  • Proficiency in Python for ETL scripting and DAG development (see the DAG sketch after this list)
  • Experience using dbt for data transformation and orchestration
  • Familiarity with CI/CD processes and tools (Git, Terraform, Serverless)
  • Degree in Computer Science, Data Engineering, Information Systems, or related fields
  • Strong leadership skills with proven experience guiding technical teams
  • Strong communication and collaboration abilities
  • Upper-Intermediate+ English level
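
As a rough illustration of the Python/Airflow work above, here is a minimal, hypothetical DAG sketch that loads daily files from GCS into BigQuery. All identifiers (DAG id, project, bucket, dataset, table) are placeholders, not details from the actual project, and the exact operators would depend on the client's environment.

```python
# Minimal sketch only; all names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="example_gcs_to_bigquery",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    # Load newly landed files from a GCS bucket into a BigQuery staging table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",            # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],  # templated daily partition
        destination_project_dataset_table="example-project.staging.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )
```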

Responsibilities

  • Review and analyze existing ETL solutions for migration to the new architecture
  • Design, optimize, and migrate batch and streaming data pipelines to the GCP Landing Zone (a streaming sketch follows this list)
  • Build and manage data transformations with dbt, supporting ELT pipelines in Snowflake
  • Ensure the new data infrastructure meets performance and quality SLAs/SLOs
  • Implement monitoring and alerting for pipelines to ensure system fault tolerance
  • Develop migration scripts to transfer historical data to Iceberg tables
  • Act as a liaison between the technical team and the client to ensure clear communication
  • Break down complex tasks into smaller, manageable technical deliverables for the team
  • Proactively identify risks and take steps to mitigate them
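
To make the pipeline-migration bullet above concrete, here is a minimal, hypothetical sketch of a Dataflow-style streaming job in Python (Apache Beam): read events from Pub/Sub, parse JSON, and write to BigQuery. The topic, table, and schema are placeholders and do not come from the actual project.

```python
# Minimal streaming sketch; topic, table, and schema are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw message bytes from a Pub/Sub topic (placeholder name).
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/example-events")
            # Decode each message as a JSON object.
            | "ParseJson" >> beam.Map(json.loads)
            # Append rows into a BigQuery table (placeholder spec and schema).
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```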

Preferred Qualifications

  • Experience building and managing streaming data pipelines and event-driven architectures
  • Experience writing Bash scripts
  • Experience with Java for Dataflow jobs
  • Familiarity with data lakehouse architectures using Iceberg tables (see the backfill sketch after this list)
  • Proficiency with Docker for containerizing data pipelines and supporting orchestration
  • Familiarity with AI-assisted development tools like GitHub Copilot
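
As a hypothetical sketch of the Iceberg lakehouse work mentioned above: backfilling historical data into an Iceberg table with PySpark. The catalog name, paths, and table names are placeholders, and the real catalog configuration would depend on the target environment.

```python
# Hypothetical backfill sketch; catalog, paths, and tables are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("historical-iceberg-backfill")  # hypothetical job name
    # Assumes an Iceberg Spark catalog named "lake" is already configured,
    # e.g. spark.sql.catalog.lake = org.apache.iceberg.spark.SparkCatalog
    .getOrCreate()
)

# Read legacy historical data from GCS (placeholder path).
historical = spark.read.parquet("gs://example-bucket/warehouse/events/")

# Append into an existing Iceberg table via the DataFrameWriterV2 API.
historical.writeTo("lake.analytics.events").append()
```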

Benefits

  • Join a fully integrated delivery team built on collaboration, transparency, and mutual respect
  • Lead a skilled Data Engineering team through a high-impact data platform transformation in a production environment
  • Work hands-on with modern, in-demand technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
  • Gain hands-on experience with Google Cloud Landing Zones
  • We are 99.9% remote, so you can work from anywhere in the world
  • Get 30 paid days off per year to use however you like: vacations, holidays, or personal time
  • 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
  • Partially covered health insurance after the probation period, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
  • We pay in U.S. dollars and cover all approved overtime
  • Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events
