Data Pipeline Engineer

YipitData

πŸ“Remote - India

Summary

Join YipitData's dynamic Data Engineering team in India as a Data Pipeline Engineer. The role is initially remote and will transition to hybrid, and it offers a unique opportunity to be among the first hires in India. You will build and maintain end-to-end data pipelines, collaborate with stakeholders, and become an expert in solving complex data pipeline issues using PySpark and SQL. The ideal candidate has 3+ years of data engineering experience, a strong understanding of Spark and SQL, and a passion for solving data challenges. YipitData offers a competitive salary and comprehensive benefits, including vacation time, parental leave, and learning reimbursement, fostering a growth-oriented and inclusive work environment.

Requirements

  • Hold a Bachelor’s or Master’s degree in Computer Science, STEM, or a related technical discipline
  • Have 3+ years of experience as a Data Engineer or in other technical functions
  • Be excited about solving data challenges and learning new skills
  • Have a strong understanding of working with data and building data pipelines
  • Be comfortable working with large-scale datasets using PySpark, Delta, and Databricks
  • Understand business needs and the rationale behind data transformations to ensure alignment with organizational goals and data strategy
  • Be eager to constantly learn new technologies
  • Be a self-starter who enjoys working collaboratively with stakeholders
  • Have exceptional verbal and written communication skills

Responsibilities

  • Report directly to the Director of Data Engineering
  • Build and maintain end-to-end data pipelines
  • Help set best practices for our data modeling and pipeline builds
  • Create documentation, architecture diagrams, and other training materials
  • Become an expert at solving complex data pipeline issues using PySpark and SQL
  • Collaborate with stakeholders to incorporate business logic into our central pipelines
  • Develop deep expertise in Databricks, Spark, and other internally developed ETL tooling

Preferred Qualifications

  • Experience with Airflow, dbt, Snowflake, or equivalent tools

Benefits

  • Vacation time
  • Parental leave
  • Team events
  • Learning reimbursement

This job is filled or no longer available