AWS Data Architect

Xebia Poland

πŸ“Remote - Worldwide

Summary

Join Xebia, a global leader in digital solutions, and contribute your expertise to our team. We are seeking a Data Platform Architect with extensive experience in cloud-native data-intensive applications, particularly on AWS (GCP experience is a plus). You will validate our Data Platform Architecture proposal, develop governance policies, define FinOps practices, and establish architectural best practices using Databricks. This role requires immediate availability and strong communication skills. The ideal candidate will have experience with data mesh, data fabric, and various data formats. Apply now to begin the conversation and join the #Xebia team.

Requirements

  • Be ready to start immediately
  • Have extensive proven experience building cloud-native, data-intensive applications
  • Have Amazon Web Services (AWS) experience
  • Design and implement scalable, efficient, and secure data architectures, including data lakes, data warehouses, and data marts
  • Have experience with data mesh, data fabric, and other data architecture methodologies
  • Be proficient in defining and enforcing data architecture principles, standards, and best practices
  • Be familiar with the modern cloud-native data stack
  • Have hands-on experience building and maintaining Spark applications
  • Have a fundamental understanding of Parquet, Delta Lake, and other open table formats (OTFs)
  • Have strong written and verbal English communication skills and be proficient in communicating with non-technical stakeholders
  • Currently reside in Moldova and hold the legal right to work in Moldova

Responsibilities

  • Validate our proposal on Data Platform Architecture (ML Platform, DWH) that includes Databricks caps usage and integration with external Airflow (AWS MWAA) and other AWS native services
  • Help develop governance policies around target organisation and usage patterns (workspace organisation, IAM including programmatic access to S3 buckets, deeper understanding of data cataloguing with Unity Catalog or similar)
  • Help define granular FinOps practices on top of the above-mentioned structure
  • Define architectural best practices using Databricks and data platforms in general, together with our DF team
  • Provide best practices and directional thinking around workspace and infrastructure creation and isolation guidelines

Preferred Qualifications

Have good experience with GCP

Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.