Software Engineer

Twilio

💵 $138k-$203k
📍 Remote - United States

Summary

Join Twilio as a Software Engineer on our Data & Analytics Platform, where you will design, build, and optimize the platform that supports data-driven initiatives across the company. You will collaborate with cross-functional teams to understand business needs and translate them into technical solutions using modern data technologies, and you will architect and implement data solutions while ensuring data quality, integrity, and security. The ideal candidate has a passion for leveraging data to drive business impact, strong technical skills, and experience with modern data technologies. This remote role offers competitive pay and benefits, including generous time off, parental and wellness leave, healthcare, and a retirement savings program. Occasional travel may be required for project or team meetings. The role is not open to candidates based in CA, CT, NJ, NY, PA, or WA.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 5+ years of experience in software development or a related field
  • Strong proficiency in programming languages such as Python, Java, or Scala
  • Strong experience building frameworks on big data technologies such as Spark, Kafka, and Hive, and with distributed computing systems
  • Experience with AWS technologies at scale
  • Solid understanding of software engineering principles, including object-oriented and functional programming paradigms, design patterns, and code quality practices
  • Excellent problem-solving and analytical skills
  • Strong verbal & written communication skills, with the ability to work effectively in a cross-functional team environment

Responsibilities

  • Design, build, and maintain infrastructure and scalable frameworks to support data ingestion, processing, and analysis
  • Collaborate with stakeholders, analysts, and product teams to understand business requirements and translate them into technical solutions
  • Architect and implement data solutions using modern data technologies such as Kafka, Spark, Hive, Hudi, Presto, and Airflow, and cloud-based services such as AWS Lake Formation, Glue, and Athena
  • Design and implement frameworks and solutions for performance, reliability, and cost-efficiency
  • Ensure data quality, integrity, and security throughout the data lifecycle
  • Stay current with emerging technologies and best practices in the big data ecosystem
  • Mentor early-career engineers and contribute to a culture of continuous learning and improvement

Preferred Qualifications

  • A bias toward action, with the ability to iterate and ship rapidly
  • A passion for building data products, demonstrated by prior projects in this area

Benefits

  • Health care insurance
  • 401(k) retirement account
  • Paid sick time
  • Paid personal time off
  • Paid parental leave
