Senior Data Engineer

Instacart

💵 $115k-$128k
📍 Remote - Canada

Summary

Join Instacart's Catalog data engineering team and play a critical role in defining how catalog data is structured and standardized to deliver consistent, reliable, timely, and accurate product information. This high-impact role involves owning essential data integration pipelines and models across Instacart's offerings, enabling efficient, high-quality data workflows. You will work closely with engineers and stakeholders, owning a large part of the process from problem understanding to solution delivery. You'll ship high-quality, scalable, and robust solutions, suggest and drive organization-wide initiatives, and have broad scope for company-level impact. Instacart offers a flexible work environment and highly competitive compensation and benefits.

Requirements

  • 6+ years of working experience in a Data/Software Engineering role, with a focus on building data pipelines
  • Expert-level SQL and knowledge of Python
  • Experience building high-quality ETL/ELT pipelines
  • Past experience with data immutability, auditability, slowly changing dimensions, or similar concepts
  • Experience building data pipelines for accounting/billing purposes
  • Experience with cloud-based data technologies such as Snowflake, Databricks, Trino/Presto, or similar
  • Able to communicate fluently with cross-functional stakeholders to drive requirements and design shared datasets
  • A strong sense of ownership, balancing urgency with shipping high-quality, pragmatic solutions
  • Experience working with a large codebase on a cross-functional team

Responsibilities

  • Build and maintain high-quality, scalable, and robust data pipelines for catalog data
  • Work closely with engineers and both internal and external stakeholders
  • Own a large part of the process from problem understanding to shipping the solution
  • Ship high-quality, scalable, and robust solutions with a sense of urgency
  • Suggest and drive organization-wide initiatives

Preferred Qualifications

  • Bachelor's degree in Computer Science, Computer Engineering, or Electrical Engineering, or equivalent work experience
  • Experience with Snowflake, dbt (data build tool), and Airflow
  • Experience building Flink pipelines
  • Experience with data quality monitoring/observability

Benefits

  • Highly market-competitive compensation and benefits
  • Remote work
  • New hire equity grant as well as annual refresh grants
