Remote Lead Data Engineer

Tide

πŸ“Remote - India

Job highlights

Summary

Join us in our mission to empower small businesses and help them save time and money. As a Data Engineer, you’ll be developing end-to-end ETL/ELT pipelines, designing scalable automated processes for data extraction, processing, and analysis, mentoring junior engineers, and more.

Requirements

  • Having 6+ years of extensive development experience with Snowflake or a similar data warehouse technology
  • Having working experience with dbt and other modern data stack technologies, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker
  • Experience with agile processes, such as Scrum
  • Extensive experience in writing advanced SQL statements and performance tuning them
  • Experience in data ingestion using custom tooling or SaaS tools such as Fivetran
  • Experience in data modelling and the ability to optimise existing and new data models
  • Experience in data mining, data warehouse solutions, and ETL, and in using databases in a business environment with large-scale, complex datasets
  • Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
  • You have experience working in an agile cross-functional delivery team
  • You have high development standards, especially for code quality, code reviews, unit testing, and continuous integration and deployment
  • You have strong technical documentation skills and the ability to be clear and precise with business users
  • You have business-level English and good communication skills
  • You have a basic understanding of various systems across the AWS platform (good to have)
  • Preferably, you have worked in a digitally native company, ideally fintech
  • Experience with Python, a governance tool (e.g. Atlan, Alation, Collibra), or a data quality tool (e.g. Great Expectations, Monte Carlo, Soda) is an added advantage

Responsibilities

  • Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts from the business functions
  • Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
  • Mentoring other junior engineers in the team
  • Being the “go-to” expert for data technologies and solutions
  • Providing on-the-ground troubleshooting and diagnosis for architecture and design challenges
  • Troubleshooting and resolving technical issues as they arise
  • Looking for ways of improving both what and how data pipelines are delivered by the department
  • Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports
  • Owning the delivery of data models and reports end to end
  • Performing exploratory data analysis to identify data quality issues early in the process, and implementing tests to prevent them in the future
  • Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Data Capture (CDC) and other “delta loading” approaches
  • Discovering, transforming, testing, deploying and documenting data sources
  • Applying, helping to define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
  • Building Looker dashboards for use cases where required

Benefits

  • Competitive salary
  • Self & Family Health Insurance
  • Term & Life Insurance
  • OPD Benefits
  • Mental wellbeing through Plumm
  • Learning & Development Budget
  • WFH Setup allowance
  • 15 days of privilege leave
  • 12 days of casual leave
  • 12 days of sick leave
  • 3 paid days off for volunteering or L&D activities
  • Stock Options
This job is filled or no longer available