Senior GCP Data Engineer

Xebia Poland

πŸ“Remote - Worldwide

Job highlights

Summary

Join Xebia as a Senior Data Engineer and collaborate with engineering, product, and data teams to deliver scalable, robust data solutions for global clients. You will design, build, and maintain data platforms and pipelines, mentor junior engineers, and work with technologies including GCP, BigQuery, Apache Airflow, and Python. This role requires 5+ years of experience in a senior developer role, proficiency with GCP services (especially BigQuery), and expertise in Apache Airflow. Strong Python skills and a deep understanding of databases are essential. The position requires working from within the European Union and holding a valid work permit.

Requirements

  • Be available to start immediately
  • 5+ years in a senior developer role, with hands-on experience in building data processing pipelines
  • Proficiency with GCP services, especially BigQuery and BigQuery SQL, for large-scale data processing and optimization
  • Extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization
  • Knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing
  • Strong Python proficiency, with expertise in modern data tools and frameworks (e.g., Databricks, Snowflake, Spark, SQL)
  • Experience with unit testing, pre-commit checks, and strict type enforcement for data pipelines
  • Deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts
  • Excellent command of oral and written English
  • Must be based in the European Union region and hold a valid work permit

Responsibilities

  • Work with various clients globally, delivering software systems and best practices for scalable and robust solutions
  • Engineer data platforms for scale, performance, reliability, and security
  • Integrate data sources and optimize data processing
  • Proactively address challenges, resolve blockers, and drive effective communication across distributed teams
  • Continuously seek opportunities to enhance data systems and ensure alignment with evolving business needs
  • Mentor new engineers

Preferred Qualifications

  • Expertise in optimizing BigQuery performance using tools like Query Profiler and addressing compute resource bottlenecks
  • Prior experience developing or testing custom operators in Apache Airflow
  • Familiarity with Docker, Kubernetes, Helm, Terraform, Kafka, and CI/CD pipelines for data environments

Disclaimer: Please verify that this job is legitimate before you apply. Applying may take you to another website that we do not own. Any actions you take during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Xebia Poland know you found this job on JobsCollider. Thanks! πŸ™