Software Engineer

Twilio

💵 $114k–$168k
📍 Remote - United States

Summary

Join Twilio as a Software Engineer on our Data & Analytics Platform team and play a crucial role in designing, building, and optimizing the platform that supports our data-driven initiatives. You will collaborate with cross-functional teams to understand business requirements, architect scalable solutions, and build data infrastructure for our Data Platform. The ideal candidate has a passion for leveraging data to drive business impact, strong technical skills, and experience with modern data technologies. In this role you will design, build, and maintain scalable frameworks for data ingestion, processing, and analysis; translate business requirements into technical solutions; and ensure data quality, integrity, and security. Mentoring early-career engineers and contributing to a culture of continuous learning and improvement are also key aspects of this role.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 3+ years of experience in software development or a related field
  • Strong proficiency in programming languages such as Python, Java, or Scala
  • Strong experience with building frameworks for big data technologies such as Spark, Kafka, Hive, and distributed computing systems
  • Experience with AWS technologies at scale
  • Solid understanding of software engineering principles, including object-oriented and functional programming paradigms, design patterns, and code quality practices
  • Excellent problem-solving and analytical skills
  • Strong verbal & written communication skills, with the ability to work effectively in a cross-functional team environment

Responsibilities

  • Design, build, and maintain infrastructure and scalable frameworks to support data ingestion, processing, and analysis
  • Collaborate with stakeholders, analysts, and product teams to understand business requirements and translate them into technical solutions
  • Architect and implement data solutions using modern data technologies such as Kafka, Spark, Hive, Hudi, Presto, Airflow, and cloud-based services like AWS Lake Formation, Glue, and Athena
  • Design and implement frameworks and solutions for performance, reliability, and cost-efficiency
  • Ensure data quality, integrity, and security throughout the data lifecycle
  • Stay current with emerging technologies and best practices in big data technologies
  • Mentor early-career engineers and contribute to a culture of continuous learning and improvement

Preferred Qualifications

  • Bias to action, with the ability to iterate and ship rapidly
  • Passion for building data products, with prior projects in this area

Benefits

  • Health care insurance
  • 401(k) retirement account
  • Paid sick time
  • Paid personal time off
  • Paid parental leave
