Summary
Join Twilio as a Staff Data Engineer (L4) and play a key role in building and maintaining the data infrastructure behind our Customer Data Platform (CDP). You will contribute to the design and implementation of high-volume data pipelines, collaborate with engineers across teams, and help ensure the platform's robustness and scalability. The role centers on high-throughput data ingestion, processing, and enrichment, and is ideal for someone with a strong data engineering background who is ready to take on broader responsibilities and make a measurable impact on the customer experience. You will build high-performance systems and solve challenging data problems. Twilio offers a remote-first culture with competitive benefits.
Requirements
- 7+ years of industry experience in backend or data engineering roles
- Strong programming skills in Scala, Java, or a similar language
- Solid experience with Apache Spark or other distributed data processing frameworks
- Experience with Trino, Snowflake, and Delta Lake, and comfort working with ecommerce-scale datasets
- Working knowledge of batch and stream processing architectures
- Experience designing, building, and maintaining ETL/ELT pipelines in production
- Familiarity with AWS and tools like Parquet, Delta Lake, or Kafka
- Comfortable operating in a CI/CD environment with infrastructure-as-code and observability tools
- Strong collaboration and communication skills
Responsibilities
- Build and maintain scalable data pipelines using Spark, Scala, and cloud-native services
- Improve the performance and reliability of our real-time and batch data processing systems
- Contribute to platform features that support key customer-facing products such as identity resolution, audience segmentation, and real-time personalization
- Work closely with Staff and Principal Engineers to execute on architectural decisions and implementation plans
- Collaborate across product and engineering teams to deliver high-impact, customer-facing capabilities
- Write clean, maintainable, and well-tested code that meets operational and compliance standards
- Participate in code reviews, technical discussions, and incident response efforts to improve system quality and resiliency
Preferred Qualifications
- Familiarity with GDPR, CCPA, or other data governance requirements
- Experience with high-scale event processing or identity resolution
- Exposure to multi-region, fault-tolerant distributed systems
Benefits
- Competitive salary
- Equity
- Generous time off
- Healthcare
- Wellness leave
- Supportive remote-first culture