Senior Data Engineer 3

Twilio

πŸ“Remote - Colombia

Summary

Join Twilio's GTM Data Engineering team as a Senior Data Engineer 3! You will architect and maintain data infrastructure supporting sales operations, connecting GTM systems with the data lake. Responsibilities include designing scalable data pipelines, ensuring seamless integration between sales platforms and analytics infrastructure, and building automated solutions for data quality. You will also create data models, manage reverse ETL pipelines, and partner with AI/ML engineers. The role requires 5+ years of experience in developing scalable data pipeline infrastructure, preferably for sales organizations, and expertise in big data processing frameworks, data orchestration tools, and infrastructure-as-code. This remote position, based in Colombia, may require occasional travel.

Requirements

  • 5+ years of experience in developing scalable data pipeline infrastructure, preferably for sales organizations
  • Proven track record of delivering large-scale data projects and working with business partners
  • Experience with big data processing frameworks such as Apache Spark
  • Experience with data orchestration tools like Airflow or Dagster
  • Experience with infrastructure-as-code tools (e.g., Terraform) and modern CI/CD pipelines

Responsibilities

  • Collaborate with other engineers, business partners, and data scientists to build best-in-class data infrastructure that meets evolving needs
  • Design and implement scalable data pipelines that integrate Salesforce and other sales systems data into our enterprise data lake
  • Build automated solutions for sales data quality, enrichment, and standardization
  • Create and maintain data models that power sales analytics, forecasting, and reporting systems
  • Design and manage reverse ETL pipelines to power sales operations and marketing automation
  • Partner with AI/ML engineers to develop predictive and generative models for Sales
  • Architect solutions for real-time sales data synchronization and processing
  • Optimize data flows between Salesforce, Snowflake, AWS Athena, and other enterprise systems
  • Build robust monitoring and alerting systems for sales data pipelines
  • Collaborate with Sales Operations to automate manual processes and improve data accuracy
  • Create documentation and enable self-service capabilities for sales teams

Preferred Qualifications

  • Experience with Salesforce data architecture and APIs
  • Experience with Python, Go, and/or Java
  • Experience with data modeling using dbt in a data warehouse such as Snowflake
  • Experience building large-scale distributed systems on AWS or a similar cloud provider
  • Experience with LLMs and/or other AI technologies

Benefits

  • Generous time-off
  • Ample parental and wellness leave
  • Healthcare
  • A retirement savings program

