AWS Data Architect

Xebia Poland
Remote - Worldwide
Summary
Join Xebia, a global leader in digital solutions, and contribute your expertise to our cloud-based data platform architecture. We are seeking a highly experienced data architect with proven success in building cloud-native, data-intensive applications on AWS (GCP experience is a plus). You will play a key role in validating our Data Platform Architecture proposal, developing governance policies, defining FinOps practices, and establishing architectural best practices using Databricks. This position requires immediate availability and a valid permit to work in Poland. Your responsibilities include collaborating with our team to design, implement, and maintain scalable and secure data architectures.
Requirements
- Be ready to start immediately
- Have proven experience building cloud-native, data-intensive applications
- Have Amazon Web Services (AWS) experience
- Design and implement scalable, efficient, and secure data architectures, including data lakes, data warehouses, and data marts
- Have experience with data mesh, data fabric, and other data architecture methodologies
- Be proficient in defining and enforcing data architecture principles, standards, and best practices
- Be familiar with the modern cloud-native data stack
- Have hands-on experience building and maintaining Spark applications
- Have a fundamental understanding of Parquet, Delta Lake, and other open table formats (OTFs)
- Have strong written and verbal English communication skills and be proficient in communicating with non-technical stakeholders
- Work from Poland and hold a valid permit to work there
Responsibilities
- Validate our Data Platform Architecture proposal (ML Platform, DWH), including Databricks usage and integration with external Airflow (AWS MWAA) and other AWS-native services
- Help develop governance policies around the target organisation and usage patterns (workspace organisation, IAM including programmatic access to S3 buckets, and data cataloguing with Unity Catalog or similar)
- Help define granular FinOps practices on top of the above-mentioned structure
- Define architectural best practices for Databricks and data platforms in general, together with our DF team
- Provide best practices and directional thinking around workspace and infrastructure creation and isolation guidelines
Preferred Qualifications
- Have good experience with GCP