Senior Data Platform Engineer

GumGum

πŸ“Remote - India

Summary

Join GumGum as a Senior Data Engineer to build, maintain, and optimize our Big Data systems powering near real-time data analytics. You will refine our data infrastructure with technologies like Spark, Kafka, Airflow, and Snowflake, leading projects and acting as a technical SME. You will collaborate with stakeholders to translate business needs into data models, own core data pipelines, design and build scalable systems, participate in code reviews, and mentor junior engineers. You will also leverage LLMs to enhance code quality and accelerate development. This role reports to the Data Platform Engineering Manager and is part of a global engineering team.

Requirements

  • Bachelor's degree in computer science/information systems or equivalent
  • 3+ years of experience with Apache Spark, Snowflake, and Airflow
  • 3+ years of Software Engineering experience (Java/Scala/Python)
  • 3+ years working with large-scale distributed real-time systems using tools such as AWS, Spark, Kafka, Hadoop
  • Minimum 5 years of overall industry experience is required
  • Experience with Spark using Scala, Java, or Python
  • Strong understanding of SQL and NoSQL databases
  • Familiarity with various AWS and GCP services, serverless architecture, and containers
  • Must be able to write quality code and build secure, highly available systems
  • Strong oral and written communication skills

Responsibilities

  • Refine our data infrastructure using technologies such as Spark, Kafka, Airflow, and Snowflake to support both batch and real-time analysis of data
  • Lead projects end-to-end, influence architecture choices, act as a technical SME for specific domains (e.g., streaming and cost optimization)
  • Work with stakeholders in various departments to fully understand the business requirements and translate them into data models
  • Own the core data pipelines and scale our data processing flow
  • Design and build scalable systems, lead technical discussions, participate in code reviews, and guide the team in engineering best practices
  • Generate reports using Snowflake SQL, Looker, and Java
  • Lead POCs to certify and test the capabilities of emerging technologies
  • Maximize the growth of our junior and intermediate engineers via mentorship
  • Support our data platform by suggesting optimizations and establishing coding and release standards
  • Utilize LLMs to improve code quality and developer experience and to accelerate feature development
