Senior Data Engineer

Twilio

💵 $126k-$185k
📍 Remote - United States

Summary

Join Twilio's Segment Data Platform team as a Senior Data Engineer and help build and scale the next generation of the company's data platform. You will work on the high-throughput data ingestion, processing, and enrichment systems that serve as the backbone of the customer data platform (CDP). The role involves building scalable data pipelines with Spark, Scala, and cloud-native services, improving system performance and reliability, and contributing to platform features that support key customer-facing products. Close collaboration with other teams is essential to delivering high-impact capabilities. The ideal candidate has strong programming skills, experience with distributed data processing frameworks, and familiarity with AWS and related tools. The position is remote, with occasional travel required.

Requirements

  • 5+ years of industry experience in backend or data engineering roles
  • Strong programming skills in Scala, Java, or a similar language
  • Solid experience with Apache Spark or other distributed data processing frameworks
  • Working knowledge of batch and stream processing architectures
  • Experience designing, building, and maintaining ETL/ELT pipelines in production
  • Familiarity with AWS and tools like Parquet, Delta Lake, or Kafka
  • Comfortable operating in a CI/CD environment with infrastructure-as-code and observability tools
  • Strong collaboration and communication skills

Responsibilities

  • Build and maintain scalable data pipelines using Spark, Scala, and cloud-native services (see the sketch after this list)
  • Improve the performance and reliability of our real-time and batch data processing systems
  • Contribute to platform features that support key customer-facing products such as identity resolution, audience segmentation, and real-time personalization
  • Work closely with Staff and Principal Engineers to execute on architectural decisions and implementation plans
  • Collaborate across product and engineering teams to deliver high-impact, customer-facing capabilities
  • Write clean, maintainable, and well-tested code that meets operational and compliance standards
  • Participate in code reviews, technical discussions, and incident response efforts to improve system quality and resiliency
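
For concreteness, the first bullet above might look something like the following minimal Spark/Scala batch sketch: read raw events from Parquet, derive a partition column, deduplicate, and write a query-ready table. This is an illustrative sketch only, not the team's actual code; the bucket paths and column names (messageId, receivedAt, userId) are hypothetical placeholders.

```scala
import org.apache.spark.sql.{SparkSession, functions => F}

// Hypothetical batch enrichment job: normalize, deduplicate, and
// repartition raw event data into a queryable table.
object EventEnrichmentJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-enrichment")
      .getOrCreate()

    // Placeholder input path; in practice this would point at the raw event store.
    val raw = spark.read.parquet("s3://example-bucket/raw/events/")

    val enriched = raw
      .withColumn("eventDate", F.to_date(F.col("receivedAt"))) // derive a date partition column
      .dropDuplicates("messageId")                             // drop replayed or duplicate events
      .filter(F.col("userId").isNotNull)                       // keep only attributable events

    // Write a partitioned, query-ready table for downstream consumers.
    enriched.write
      .mode("overwrite")
      .partitionBy("eventDate")
      .parquet("s3://example-bucket/enriched/events/")

    spark.stop()
  }
}
```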

Preferred Qualifications

  • Experience with Trino, Flink, or Snowflake
  • Familiarity with GDPR, CCPA, or other data governance requirements
  • Experience with high-scale event processing or identity resolution
  • Exposure to multi-region, fault-tolerant distributed systems

Benefits

  • Competitive pay
  • Generous time off
  • Ample parental and wellness leave
  • Healthcare
  • A retirement savings program
