Staff Data Engineer


Route

💵 $176k-$198k
📍 Remote - United States

Summary

Join Route's Engineering team as a Staff Data Engineer and play a pivotal role in maintaining and expanding our data infrastructure. This highly autonomous role requires a seasoned engineer with strong technical skills to ensure the continuous and reliable ingestion of data. You will design and implement new data integrations, work cross-functionally with various teams, and leverage technologies like Snowflake, Databricks, and AWS. The position demands expertise in data processing, pipeline development, and cloud infrastructure. Route offers a comprehensive benefits package including 100% health insurance premium coverage, remote work options, unlimited PTO, 401k matching, and professional development opportunities.

Requirements

  • 5+ years of experience in data engineering or a similar field
  • Expertise in Snowflake, dbt, Databricks, and Spark for data processing and analytics
  • Strong programming skills in Python for data pipeline development and automation
  • Experience with SnapLogic, Airflow, or similar ETL orchestration tools
  • Proficiency in cloud infrastructure and data services within the AWS ecosystem
  • Hands-on experience with Terraform for infrastructure as code
  • Strong problem-solving skills with the ability to work independently in a fast-paced environment
  • Excellent communication skills and ability to collaborate across teams
  • Familiarity with real-time data processing architectures
  • Knowledge of data governance, security, and compliance best practices

Responsibilities

  • Own and maintain the reliability of existing data ingestion pipelines, ensuring uninterrupted data flow
  • Design, develop, and optimize new data ingestion and integration pipelines
  • Work cross-functionally with data analysts, product managers, back-end and front-end engineering teams, and business stakeholders to deliver robust data solutions
  • Leverage Snowflake, Databricks, dbt, and Spark to build scalable and high-performance data processing solutions
  • Implement and maintain SnapLogic, Airflow, and Terraform for ETL orchestration, workflow automation, and infrastructure as code
  • Ensure high availability, scalability, and security of data infrastructure using the AWS suite
  • Troubleshoot and optimize performance across the data pipeline stack
  • Stay ahead of industry trends and continuously improve data engineering best practices
  • Partner with Data Analysts to enhance analytics and provide guidance on queries that impact multiple departments

Preferred Qualifications

  • Experience using AI agents to interact with databases
  • Experience in the e-commerce or post-purchase space is a plus

Benefits

  • 100% of your health insurance premiums covered on a $0 deductible plan for you and your family
  • Remote or hybrid work arrangements
  • Unlimited PTO
  • 401k matching
  • Formalized growth opportunities
  • Learning & development
  • DEI programs & events