Remote Data Engineer
Twilio
Remote - Ireland
Please let Twilio know you found this job on JobsCollider. Thanks!
Job highlights
Summary
Join Twilio's Data & Analytics Platform team as a Data Engineer. You will design and build data platforms and services, manage data infrastructure in cloud environments, and contribute to strategic business decisions. This role requires collaboration with various teams and participation in Agile/Scrum activities. You'll need a Bachelor's degree in Computer Science or equivalent experience, along with 3+ years of data engineering experience and proficiency in specific technologies and methodologies. Opportunities to learn new technologies and skills are abundant. Twilio offers a remote-first work environment and various benefits.
Requirements
- Bachelor's degree in Computer Science, or equivalent experience
- 3+ years of data engineering experience
- Experience with SQL
- Experience with one or more of Python, Java, Scala
- Experience with Cloud Computing platforms (e.g. AWS, GCP, Azure)
- Experience with Transformation/ETL (e.g. dbt, Spark SQL/Scala/PySpark, AWS Glue)
- Experience with Distributed databases (e.g. Athena, Presto, Hive, Snowflake, Redshift)
- Experience with one or more data modeling approaches: dimensional modeling (star/snowflake schema), lakehouse/medallion layering (bronze, silver, gold), Data Vault
- Collaborative mindset and ability to work with distributed, cross-functional teams
- Solid communication skills and the ability to clearly articulate your point of view
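The SQL and dimensional-modeling requirements above can be illustrated with a minimal sketch. The table and column names here are hypothetical, purely for illustration, and not Twilio's actual schema; SQLite stands in for a distributed warehouse like Snowflake or Redshift:

```python
import sqlite3

# Minimal star schema: one fact table with foreign keys to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE dim_date     (date_id INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_usage (
        customer_id   INTEGER REFERENCES dim_customer(customer_id),
        date_id       INTEGER REFERENCES dim_date(date_id),
        messages_sent INTEGER
    );
    INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'AMER');
    INSERT INTO dim_date     VALUES (10, '2024-01'), (11, '2024-02');
    INSERT INTO fact_usage   VALUES (1, 10, 100), (1, 11, 150), (2, 10, 80);
""")

# Typical analytical query: aggregate facts, sliced by dimension attributes.
rows = conn.execute("""
    SELECT c.region, d.month, SUM(f.messages_sent) AS total
    FROM fact_usage f
    JOIN dim_customer c USING (customer_id)
    JOIN dim_date     d USING (date_id)
    GROUP BY c.region, d.month
    ORDER BY c.region, d.month
""").fetchall()
print(rows)  # [('AMER', '2024-01', 80), ('EMEA', '2024-01', 100), ('EMEA', '2024-02', 150)]
```

The same shape of transform is what tools like dbt or Spark SQL express at warehouse scale: facts joined to dimensions, then grouped and aggregated.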
Responsibilities
- Design and build highly scalable platforms and services to support rapidly growing data needs at Twilio
- Identify opportunities for improvement in our data infrastructure and implement solutions that improve efficiency and scalability
- Cultivate a culture of collaboration and innovation within the Data & Analytics Platform team
- Collaborate with data architects, product managers, and other engineers to ensure seamless integration of data platforms and services into Twilio's overall technology stack
- Participate in Agile/Scrum activities, including planning, stand-ups, retrospectives, and provide a point of view on user stories
- Develop and maintain technical documentation to ensure ease of use and adoption of data platforms and services
Preferred Qualifications
- Master's degree in Computer Science
- Experience with Containerisation (e.g. Docker, Kubernetes)
- Experience with CI/CD (e.g. Buildkite, ArgoCD)
- Experience with Ingestion systems (e.g. Kafka, Debezium, Meltano)
- Experience with Big data file formats (e.g. Hudi, Parquet, Iceberg, Delta)
- Experience with Relational/OLTP databases (e.g. PostgreSQL, Aurora/MySQL)
- Experience with OLAP Databases (e.g. Clickhouse, Druid, Pinot)
- Experience with Generative AI (e.g. LLM, Langchain, Bedrock)
- Experience working in an agile environment and iterative development
- Excellent written and verbal communication skills
- Ability to influence and build effective working relationships with all levels of the organization
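Several items across the Requirements and Preferred Qualifications (ingestion systems, big data file formats, medallion layering) converge on layered pipeline design. A toy bronze → silver → gold pass, using only the standard library and hypothetical field names as a sketch of the pattern:

```python
import json

# Bronze layer: raw events as ingested (duplicates and bad records included).
bronze = [
    '{"user": "a", "sms": 3}',
    '{"user": "a", "sms": 3}',       # exact duplicate
    '{"user": "b", "sms": "oops"}',  # malformed value
    '{"user": "b", "sms": 5}',
]

# Silver layer: parse, validate types, and de-duplicate.
seen, silver = set(), []
for line in bronze:
    rec = json.loads(line)
    if not isinstance(rec.get("sms"), int):
        continue                     # drop records failing validation
    key = (rec["user"], rec["sms"])
    if key in seen:
        continue                     # drop exact duplicates
    seen.add(key)
    silver.append(rec)

# Gold layer: aggregate into a consumption-ready shape for analytics.
gold = {}
for rec in silver:
    gold[rec["user"]] = gold.get(rec["user"], 0) + rec["sms"]

print(gold)  # {'a': 3, 'b': 5}
```

In production the layers would live in a lakehouse format such as Iceberg, Hudi, or Delta, with ingestion via Kafka or Debezium, but the progression from raw to validated to aggregated data is the same.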
Benefits
- Competitive pay
- Generous time off
- Ample parental and wellness leave
- Healthcare
- A retirement savings program
Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.