Staff Data Engineer

Twilio

πŸ’΅ $149k-$197k
πŸ“Remote - United States

Job highlights

Summary

Join Twilio's Segment product team as a Staff Data Engineer. This role is the backbone of data-driven decisions: you will partner with stakeholders, gather requirements, and extract value from data. You will design, build, and maintain data pipelines that process terabyte-scale datasets using batch and streaming techniques. Responsibilities include optimizing the data warehouse (Snowflake), building data-driven processes, and collaborating with data scientists. The position requires extensive experience in data engineering, data warehousing, and related technologies. While remote, the role is not eligible for hire in certain California areas. Twilio offers competitive pay and benefits, including generous time off, parental and wellness leave, healthcare, and a retirement savings program.

Requirements

  • 7+ years of experience in data engineering or related fields, with a strong focus on designing and building scalable data systems
  • Experience in designing scalable data warehouses and working with modern data warehousing solutions, such as Snowflake
  • Experience with data orchestration tools like Airflow and dbt, with a solid understanding of data modeling and ETL principles
  • Experience with infrastructure-as-code tools (e.g., Terraform) and modern CI/CD pipelines
  • Proven track record of delivering large-scale data projects and working in cross-functional teams
  • Self-starter with the ability to work independently and autonomously, as well as part of a team

Responsibilities

  • Design, build, and maintain data pipelines that collect, process, and transform large volumes of data from various sources into a format suitable for analysis
  • Develop and maintain our data warehouse (Snowflake) to enable efficient and accurate analysis of data
  • Document data pipelines, data models, and data transformation processes
  • Collaborate with cross-functional teams to identify and understand data requirements for various business needs
  • Work with data scientists to build our internal machine learning infrastructure

Preferred Qualifications

  • Experience building large-scale distributed systems in AWS
  • Experience with Python, Go, and/or Java
  • Experience with streaming technology stacks such as Kafka or Kinesis
  • Experience with managing and deploying machine learning models

Benefits

  • Competitive pay
  • Generous time off
  • Ample parental and wellness leave
  • Healthcare
  • A 401(k) retirement savings program
  • Paid sick time

Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Twilio know you found this job on JobsCollider. Thanks! πŸ™