Senior Data Platform Engineer

Apollo.io

πŸ“Remote - United States

Summary

Join Apollo.io as a Senior Data Platform Engineer and play a key role in designing and building the foundational data infrastructure and APIs that power our analytics, machine learning, and product features. You will develop scalable data pipelines, manage cloud-native data platforms, and build high-performance APIs with FastAPI. This hands-on role offers real opportunities to shape architecture, tooling, and best practices. You will collaborate with teams across the company to translate requirements into engineering solutions, ensure the health and reliability of data flows, and continuously evaluate and integrate new technologies. You will also document your work, champion best practices, and contribute to team knowledge sharing.

Requirements

  • 5+ years of experience in platform engineering, data engineering, or another data-facing role
  • Experience building data applications
  • Deep knowledge of the data ecosystem and the ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (e.g., Physical or Computer Science, Engineering, Mathematics, or Statistics)

Responsibilities

  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows (a minimal sketch follows this list)
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
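
To give a concrete flavor of the FastAPI work described above, here is a minimal sketch of a data-service endpoint. Everything in it (the service title, the Record model, the in-memory store, the routes) is a hypothetical illustration, not Apollo.io's actual API:

```python
# Minimal FastAPI data-service sketch. All names are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-data-service")

class Record(BaseModel):
    id: int
    name: str

# A real service would query a database or warehouse; an in-memory
# dict keeps the sketch self-contained.
FAKE_STORE = {1: Record(id=1, name="example")}

@app.get("/health")
def health() -> dict:
    """Liveness probe, the kind of hook the monitoring duties above rely on."""
    return {"status": "ok"}

@app.get("/records/{record_id}", response_model=Record)
def get_record(record_id: int) -> Record:
    """Return one record, or a 404 if the id is unknown."""
    record = FAKE_STORE.get(record_id)
    if record is None:
        raise HTTPException(status_code=404, detail="record not found")
    return record
```

Run locally with `uvicorn module_name:app --reload` (assuming the file is saved as module_name.py).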

Preferred Qualifications

  • Experience using the Python data stack
  • Experience deploying and managing data pipelines in the cloud
  • Experience working with technologies like Airflow, Hadoop, and Spark (a minimal Airflow sketch follows this list)
  • Understanding of streaming technologies like Kafka and Spark Streaming
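
For orientation only, here is a minimal sketch of the kind of daily batch pipeline referenced above, written with the Airflow TaskFlow API and assuming Airflow 2.4+ (where `schedule` replaced `schedule_interval`). The DAG id, task names, and extract/load bodies are hypothetical placeholders:

```python
# Minimal Airflow 2.x TaskFlow sketch of a daily batch pipeline.
# The DAG id, task names, and extract/load logic are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_daily_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull rows from a source system.
        return [{"id": 1, "value": 42}]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write rows to a warehouse table.
        print(f"loaded {len(rows)} rows")

    # Wiring the tasks sets the extract -> load dependency.
    load(extract())

example_daily_pipeline()
```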

Benefits

  • We invest deeply in your growth, ensuring you have the resources, support, and autonomy to own your role and make a real impact
  • Collaboration is at our core: we're all for one, meaning you'll have a team across departments ready to help you succeed
  • We encourage bold ideas and courageous action, giving you the freedom to experiment, take smart risks, and drive big wins
