Senior GCP Data Engineer

Xebia Poland

πŸ“Remote - Worldwide

Job highlights

Summary

Join Xebia as a Senior Data Engineer and collaborate with engineering, product, and data teams to deliver scalable and robust data solutions for global clients. You will design, build, and maintain data platforms and pipelines, mentor junior engineers, and work with technologies such as GCP, BigQuery, and Apache Airflow. The role requires 5+ years in a senior developer role, including hands-on experience building data processing pipelines, and proficiency in GCP services, especially BigQuery and BigQuery SQL. The ideal candidate has strong Python proficiency and a deep understanding of relational and NoSQL databases. The position is open only to candidates residing in Moldova with the legal right to work there.

Requirements

  • Be available to start immediately
  • Have 5+ years in a senior developer role, with hands-on experience in building data processing pipelines
  • Possess proficiency with GCP services, especially BigQuery and BigQuery SQL, for large-scale data processing and optimization
  • Have extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization
  • Possess knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing
  • Have strong Python proficiency, with expertise in modern data frameworks and platforms (e.g., Spark, Databricks, Snowflake) and SQL
  • Have experience with unit testing, pre-commit checks, and strict type enforcement for data pipelines
  • Possess a deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts
  • Have an excellent command of oral and written English
  • Currently reside in Moldova and hold the legal right to work in Moldova

Responsibilities

  • Work closely with engineering, product, and data teams to deliver scalable and robust data solutions to our clients
  • Design, build, and maintain data platforms and pipelines
  • Mentor new engineers
  • Work with various clients globally, delivering software systems and applying best practices for scalable and robust solutions
  • Engineer data platforms for scale, performance, reliability, and security
  • Integrate data sources and optimize data processing
  • Proactively address challenges, resolve blockers, and drive effective communication across distributed teams
  • Continuously seek opportunities to enhance data systems and ensure alignment with evolving business needs

Preferred Qualifications

  • Have expertise in optimizing BigQuery performance using tools like Query Profiler and addressing compute resource bottlenecks
  • Have prior experience developing or testing custom operators in Apache Airflow
  • Possess familiarity with Docker, Kubernetes, Helm, Terraform, Kafka, and CI/CD pipelines for data environments
