Senior Data Engineer

Swiftly

💵 $78k-$84k
📍 Remote - Argentina

Summary

Join Swiftly, a retail digital technology startup, as a Senior Data Engineer and help scale our Retailer Platform using Azure, Databricks, and API development. In this full-time contract role, you will architect and implement scalable data pipelines, design data models, lead ETL/ELT workflows, ensure data quality and governance, and partner with cross-functional teams. The ideal candidate has 5+ years of data engineering experience; expertise in PySpark, SQL, and modern data lake/lakehouse concepts; and experience with Delta Lake and CI/CD pipelines. Swiftly offers a collaborative environment and values employees who take ownership and are always learning. We are an Equal Opportunity Employer.

Requirements

  • 5+ years of data engineering experience, with 3+ years of hands-on experience in Databricks
  • Expertise in PySpark, SQL, and modern data lake/lakehouse concepts
  • Strong understanding of cloud platforms (AWS, Azure, or GCP)
  • Experience with Delta Lake, Unity Catalog, and CI/CD pipelines for data
  • Proven ability to lead complex data initiatives from design to delivery
  • Experience with Medallion Architecture
  • Demonstrated ability to work collaboratively in an ambiguous, fast-paced environment
  • Takes ownership of their domain from the ground up, from inception through deployment to customers
  • Leaves their ego at the door and ensures the best idea leaves the room
  • Always experimenting with new technologies and learning new skills

Responsibilities

  • Architect and implement scalable data pipelines using Databricks, Apache Spark, and Delta Lake
  • Design and optimize data models and lakehouse architectures for performance and reliability
  • Lead the development of ETL/ELT workflows in collaboration with data analysts, scientists, and stakeholders
  • Ensure data quality, governance, and security best practices across the data ecosystem
  • Partner with cross-functional teams to understand data needs and deliver effective solutions
  • Other related duties as assigned

Preferred Qualifications

  • Experience with MLflow, data governance frameworks, and streaming data (Structured Streaming, Kafka)
  • Experience with Databricks Workflows and Postgres

Benefits

$6,500 - $7,000 a month
