Data Engineering Platform

Degreed

πŸ“Remote - India

Summary

Join Degreed's Data Engineering team and build the data infrastructure powering our platform. You will collaborate with product and engineering teams to design and implement data pipelines, models, and solutions for reporting and analytics. Responsibilities span the entire data lifecycle, from metric definition and instrumentation to performance optimization and scalability, using technologies such as Python, dbt, Snowflake, and Azure. The ideal candidate holds a Bachelor's or Master's degree in a related field and brings 7+ years of data engineering experience with expertise in data modeling, Python, SQL, and cloud technologies. Degreed offers a comprehensive benefits package, flexible work arrangements, and a commitment to diversity and inclusion.

Requirements

  • Bachelor's/Master's degree in Computer Science, Data Science, or a related field
  • 7+ years of experience in data engineering
  • Experience in data modeling for complex business domains
  • Hands-on technical experience with Python and SQL, including dbt and Snowflake or a comparable warehouse such as Redshift, BigQuery, or ClickHouse
  • Experience with one or more cloud platforms (AWS, Azure, GCP)
  • Solid experience with data lakes and data lake architecture best practices
  • Strong understanding of data governance principles and best practices
  • Excellent analytical, problem-solving, and communication skills

Responsibilities

  • Collaborate with product and engineering teams to define key performance indicators (KPIs) and implement robust logging and instrumentation strategies
  • Design and implement scalable data ingestion pipelines from diverse sources
  • Develop and maintain efficient, scalable data models (e.g., star schema, data vault) within data warehouses (Snowflake, Redshift, ClickHouse, etc.), and architect and maintain data lakes (Azure)
  • Implement complex data transformations using Python and dbt across multiple data sources and targets
  • Establish and enforce data quality standards, implement data governance policies, and develop data documentation to ensure data accuracy, consistency, and reliability. Enable self-service analytics by providing clear documentation and training to stakeholders
  • Develop automated alerting systems to monitor data quality and identify anomalies. Create interactive dashboards and reports using visualization tools such as GoodData
  • Continuously optimize data pipelines and infrastructure for performance, scalability, and cost-efficiency, leveraging cloud-native technologies and best practices
  • Work autonomously and collaboratively within a cross-functional team, contributing technical expertise and driving data-driven decision-making. Lead technical projects, mentor junior team members (if applicable), and contribute to the team's technical roadmap

Benefits

  • Comprehensive benefits package designed to support your well-being, growth, and success
  • Flexible work arrangements
