Data Platform Engineer


One

πŸ“Remote - United States

Job highlights

Summary

Join One, a fintech company backed by Ribbit and Walmart, as a Data Platform Engineer. You will design, develop, and maintain our data infrastructure, including data pipelines and data warehouses. In this pivotal role, you will drive our data strategy, support machine learning operations at scale, and collaborate closely with data science and engineering teams. You will be responsible for building robust and scalable data platforms, ensuring real-time data flow, and establishing MLOps workflows. The ideal candidate possesses extensive experience in data engineering, expertise in Apache Spark and Python, and strong SQL skills. This is an opportunity to contribute to a company focused on helping customers achieve financial progress.

Requirements

  • 5+ years of experience in data engineering or a similar role
  • Expertise in Apache Spark for large-scale data processing
  • Advanced knowledge of production-level Python
  • Strong SQL skills for data manipulation and ETL processes
  • Experience with real-time streaming technologies such as Kafka or Kinesis
  • Familiarity with MLOps practices and workflows, including feature engineering, model training, and serving
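To make the Spark, Python, and streaming requirements concrete, here is a minimal sketch of the kind of pipeline this role involves: PySpark Structured Streaming reading JSON events from Kafka and landing them in a Delta table. It assumes a Databricks-style environment with the Kafka connector and Delta Lake available; the broker address, topic, schema, and table names are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("transactions-stream").getOrCreate()

# Hypothetical event schema; field names are illustrative, not from the posting.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from Kafka (broker address and topic are placeholders).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "transactions")
    .load()
)

# Kafka delivers the payload as bytes; parse the JSON value into typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append the parsed stream to a Delta table; the checkpoint lets Spark
# recover the stream and avoid reprocessing on restart.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/bronze_transactions")
    .outputMode("append")
    .toTable("bronze.transactions")
)
query.awaitTermination()
```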

Responsibilities

  • Design, implement, and maintain robust streaming and batch data pipelines using Databricks, Spark, and Python
  • Ensure data infrastructure is reliable, secure, and scalable, adhering to industry regulations and data governance standards
  • Monitor, troubleshoot, and proactively improve data infrastructure to ensure high availability and performance
  • Build and optimize data platforms and warehouses to meet the evolving needs of stakeholders across analytics, machine learning, and backend systems
  • Assist in re-architecting batch pipelines into streaming pipelines, ensuring real-time data flow for ML and operational needs
  • Collaborate with data scientists and analysts to streamline data processing for advanced analytics and machine learning
  • Establish and maintain MLOps workflows, ensuring seamless deployment, monitoring, and serving of ML models and features
  • Drive feature engineering and create systems to stage and serve features for machine learning
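For illustration only, here is a minimal sketch of the feature-staging pattern described in the last responsibility, assuming a Databricks/Delta Lake environment; the table names, columns, and 30-day window are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("user-spend-features").getOrCreate()

# Source table name is a placeholder for a curated transactions table.
txns = spark.table("silver.transactions")

# Example features: 30-day transaction aggregates per user.
features = (
    txns.where(F.col("event_time") >= F.date_sub(F.current_date(), 30))
    .groupBy("user_id")
    .agg(
        F.count("*").alias("txn_count_30d"),
        F.sum("amount").alias("spend_30d"),
        F.avg("amount").alias("avg_txn_amount_30d"),
    )
    .withColumn("feature_ts", F.current_timestamp())
)

# Stage the features in a Delta table keyed by user_id so training jobs can
# read them directly and a separate job can sync them to an online store.
(
    features.write.format("delta")
    .mode("overwrite")
    .saveAsTable("features.user_spend_30d")
)
```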

Preferred Qualifications

  • Proficiency in Databricks for managing data pipelines and analytics workflows
  • Experience with Infrastructure as Code (IaC); Terraform and AWS experience preferred
  • Strong problem-solving skills and the ability to work collaboratively in cross-functional teams
  • An "act-like-an-owner" mentality with a bias toward taking action




Please let One know you found this job on JobsCollider. Thanks! 🙏