Senior GCP Data Engineer
Xebia Poland
Remote - Worldwide
Summary
Join Xebia as a Senior Data Engineer and collaborate with engineering, product, and data teams to deliver scalable and robust data solutions for global clients. You will design, build, and maintain data platforms and pipelines, mentor junior engineers, and work with various technologies. This role requires 5+ years of experience in a senior developer role, proficiency in GCP services (especially BigQuery), and expertise in Apache Airflow. Strong Python skills and a deep understanding of databases are essential. The position requires immediate availability and a work permit within the European Union.
Requirements
- Be available to start immediately
- Have 5+ years in a senior developer role, with hands-on experience in building data processing pipelines
- Possess proficiency with GCP services, especially BigQuery and BigQuery SQL, for large-scale data processing and optimization
- Have extensive experience with Apache Airflow, including DAG creation, triggers, and workflow optimization
- Possess knowledge of data partitioning, batch configuration, and performance tuning for terabyte-scale processing
- Have strong Python proficiency and expertise in SQL and modern data frameworks (e.g., Spark, Databricks, Snowflake)
- Have experience with unit testing, pre-commit checks, and strict type enforcement for data pipelines
- Possess a deep understanding of relational and NoSQL databases, data modelling, and data warehousing concepts
- Have an excellent command of oral and written English
- Be based in the European Union and hold a valid work permit
Responsibilities
- Work closely with engineering, product, and data teams to deliver scalable and robust data solutions to our clients
- Design, build, and maintain data platforms and pipelines
- Mentor new engineers
- Work with various clients globally, delivering software systems and best practices for scalable and robust solutions
- Engineer data platforms for scale, performance, reliability, and security
- Integrate data sources and optimize data processing
- Proactively address challenges, resolve blockers, and drive effective communication across distributed teams
- Continuously seek opportunities to enhance data systems and ensure alignment with evolving business needs
Preferred Qualifications
- Have expertise in optimizing BigQuery performance using tools like Query Profiler and addressing compute resource bottlenecks
- Have prior experience developing or testing custom operators in Apache Airflow
- Possess familiarity with Docker, Kubernetes, Helm, Terraform, Kafka, and CI/CD pipelines for data environments