Intermediate Data Engineer

Dev.Pro

πŸ“Remote - Brazil

Summary

Join Dev.Pro as an Intermediate Data Engineer and contribute to a large-scale data modernization effort for a major enterprise client. You will migrate and transform complex legacy data pipelines to a modern cloud environment. Collaborate with architects, DevOps, QA, and product stakeholders to deliver scalable, reliable data solutions. This role involves hands-on work with modern technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery. The position requires 3+ years of data engineering experience, strong SQL skills, and proficiency in Python and ETL processes. Dev.Pro offers a fully remote work environment and various benefits.

Requirements

  • 3+ years in data engineering and data warehouse modeling
  • Strong proficiency in designing and building ETL pipelines for large data volumes and streaming solutions
  • Expert-level SQL skills and experience with Snowflake and Apache Iceberg tables
  • Hands-on experience with GCP services (BigQuery, GCS, Airflow, Dataflow, Dataproc, Pub/Sub)
  • Proficiency in Python for ETL scripting and DAG development (see the illustrative sketch after this list)
  • Experience using dbt for data transformation and orchestration
  • Familiarity with CI/CD processes and tools (Git, Terraform, Serverless)
  • Degree in Computer Science, Data Engineering, Information Systems, or related fields
  • Strong communication and collaboration abilities
  • Upper-Intermediate+ English level
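
To give a concrete, purely illustrative sense of the Python and Airflow work listed above, here is a minimal sketch of a daily GCS-to-BigQuery load, assuming a recent Airflow 2.x release with the Google provider package installed; the DAG ID, bucket, dataset, and table names are hypothetical placeholders, not the client's actual pipelines.

```python
# Minimal illustrative sketch only; assumes Airflow 2.x + apache-airflow-providers-google.
# All names (DAG ID, bucket, dataset, table) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_orders_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Load the day's CSV files from the landing bucket into a BigQuery staging table
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders_to_bq",
        bucket="example-landing-bucket",              # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.csv"],     # files partitioned by execution date
        destination_project_dataset_table="analytics_staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
    )
```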

Responsibilities

  • Review and analyze existing ETL solutions for migration to the new architecture
  • Design, optimize, and migrate batch and streaming data pipelines to the GCP Landing Zone
  • Build and manage data transformations with dbt, supporting ELT pipelines in Snowflake
  • Ensure the new data infrastructure meets performance and quality SLAs/SLOs
  • Implement monitoring and alerting for pipelines to ensure system fault tolerance
  • Develop migration scripts to transfer historical data to Iceberg tables (an illustrative backfill sketch follows this list)
  • Collaborate closely with the team and other stakeholders to align on data requirements and solutions
  • Participate in code reviews, design discussions, and technical planning
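
As a purely illustrative sketch of the historical-backfill work mentioned above, the snippet below copies one date slice of a legacy BigQuery table into a new landing table using the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical, and a real migration script would target the client's Iceberg tables and schemas.

```python
# Minimal illustrative sketch only; assumes the google-cloud-bigquery client library.
# Project, dataset, table, and column names are hypothetical placeholders.
import datetime

from google.cloud import bigquery

client = bigquery.Client()

# Copy a single date slice of the legacy table into the new landing dataset
sql = """
    INSERT INTO `example-project.landing.orders_history` (order_id, order_ts, amount)
    SELECT order_id, order_ts, amount
    FROM `example-project.legacy.orders`
    WHERE DATE(order_ts) = @run_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", datetime.date(2024, 1, 1))
    ]
)
job = client.query(sql, job_config=job_config)
job.result()  # block until this backfill slice finishes
print(f"Backfilled {job.num_dml_affected_rows} rows")
```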

Preferred Qualifications

  • Experience building and managing streaming data pipelines and event-driven architectures
  • Experience writing Bash scripts
  • Experience with Java for Dataflow jobs
  • Familiarity with data lakehouse architectures using Iceberg tables
  • Proficiency with Docker for containerizing data pipelines and supporting orchestration
  • Familiarity with AI-assisted tools like GitHub Copilot

Benefits

  • Join a fully integrated delivery team built on collaboration, transparency, and mutual respect
  • Contribute to high-impact data platform transformation and gain experience with Google Landing Zones
  • Work hands-on with modern, in-demand technologies like GCP, Snowflake, Apache Iceberg, dbt, Airflow, Dataflow, and BigQuery
  • We are 99.9% remote: you can work from anywhere in the world
  • Get 30 paid days off per year to use however you like: vacations, holidays, or personal time
  • 5 paid sick days, up to 60 days of medical leave, and up to 6 paid days off per year for major family events like weddings, funerals, or the birth of a child
  • Partially covered health insurance after the probation period, plus a wellness bonus for gym memberships, sports nutrition, and similar needs after 6 months
  • We pay in U.S. dollars and cover all approved overtime
  • Join English lessons and Dev.Pro University programs, and take part in fun online activities and team-building events
