Senior AWS Data Engineer

Xebia Poland

πŸ“Remote - Worldwide

Job highlights

Summary

Join Xebia, a global leader in digital solutions, as a Senior AWS Data Engineer specializing in large-scale infrastructure design and deployment. You will build and maintain architecture patterns for data processing, translate technical designs into code, and establish automated processes for data analysis. Collaboration with data scientists and stakeholders is central to the role, so excellent communication skills are essential. The position demands 5+ years of data engineering experience, strong SQL and Python skills, and expertise in cloud data platforms such as Databricks or Snowflake, along with immediate availability and willingness to work US time zone hours.

Requirements

  • Be available to start immediately
  • Be ready to work US time zone hours (up to 9 PM CET)
  • Have 5+ years’ experience as a data engineer
  • Have 2+ years’ experience with AWS
  • Have strong SQL skills
  • Have Python scripting proficiency
  • Have hands-on experience in building data processing pipelines
  • Have experience in structuring and modelling data in both relational and non-relational forms
  • Have strong expertise in cloud data platforms/warehouses like Databricks or Snowflake
  • Have extensive experience in big data engineering on a terabyte scale, including streaming technologies and near-real-time processing
  • Have experience working with version control systems such as Git
  • Have an excellent command of spoken and written English
  • Have the ability to work with different stakeholders and drive consensus within the team
  • Be based in the European Union and hold a valid work permit

Responsibilities

  • Be responsible for at-scale infrastructure design, build, and deployment, with a focus on distributed systems
  • Build and maintain architecture patterns for data processing, workflow definitions, and system-to-system integrations using Big Data and Cloud technologies
  • Evaluate and translate technical designs into workable solutions/code and technical specifications on par with industry standards
  • Drive the creation of reusable artifacts
  • Establish scalable, efficient, automated processes for data analysis, data model development, validation, and implementation
  • Work closely with analysts and data scientists to understand the impact on downstream data models
  • Write efficient and well-organized software to ship products in an iterative, continual release environment
  • Contribute to and promote good software engineering practices across the team
  • Communicate clearly and effectively to technical and non-technical audiences
  • Define data retention policies
  • Monitor performance and advise on any necessary infrastructure changes

Preferred Qualifications

  • Have proficiency in designing and implementing ETL/ELT processes and data integration workflows using tools such as Apache Airflow and AWS Glue
  • Have an understanding of big data and DevOps technologies (Kafka, Spark, Helm, Terraform)
  • Have experience in CI/CD for the data environment
  • Have experience in testing for data processing
  • Have experience operationalizing ML models (e.g., in Docker, Kubernetes)
