Remote Senior Staff Engineer, Big Data

Nagarro

πŸ“Remote - Sri Lanka

Job highlights

Summary

Join our dynamic, non-hierarchical work culture as a Senior Staff Engineer, Big Data. We're looking for someone with strong skills in Databricks, Python, and Azure cloud services. The ideal candidate will have at least 9 years of experience in the IT industry and hands-on experience implementing data solutions on Databricks.

Requirements

  • Minimum 9 years of experience in the IT industry
  • Must have hands-on experience in Azure cloud services such as ADF, ADLS, Azure SQL, Logic Apps, and Azure Functions
  • Strong hands-on experience in Databricks and related architecture like Medallion
  • Experience in implementing and delivering data solutions and pipelines on Databricks
  • A strong understanding of data modeling, data structures, data warehouses, databases, and ETL processes
  • An in-depth understanding of large-scale data sets, including structured and unstructured data
  • Experience in SQL, Python, and PySpark
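For context, the Medallion architecture named in the requirements organizes data into bronze (raw), silver (cleaned and typed), and gold (business-level aggregate) layers. A minimal sketch in plain Python, using hypothetical in-memory records in place of Databricks Delta tables:

```python
# Illustrative sketch of Medallion-style layering (bronze -> silver -> gold).
# Hypothetical in-memory order records stand in for Databricks Delta tables.

# Bronze: raw ingested events, kept as-is (including bad rows).
bronze = [
    {"order_id": "1", "amount": "19.99", "country": "LK"},
    {"order_id": "2", "amount": "oops", "country": "LK"},  # malformed amount
    {"order_id": "3", "amount": "5.00", "country": "IN"},
]

def to_silver(rows):
    """Silver: validated and typed records; malformed rows are dropped."""
    silver = []
    for row in rows:
        try:
            silver.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
                "country": row["country"],
            })
        except ValueError:
            # In a real pipeline this row would land in a quarantine table.
            continue
    return silver

def to_gold(rows):
    """Gold: business-level aggregate (revenue per country)."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'LK': 19.99, 'IN': 5.0}
```

On Databricks, each layer would typically be a Delta table written by a PySpark job, but the layering idea is the same: raw data is preserved, and quality guarantees increase at each step.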

Responsibilities

  • Create documentation and knowledge bases for data pipelines, best practices, and solutions
  • Work under tight timelines and deliver on complex problems
  • Be a strong and positive team player

Preferred Qualifications

  • Experience with CI/CD and DevOps practices

Job description

Company Description

We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale β€” across all devices and digital mediums, and our people exist everywhere in the world (15,000+ experts across 26 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

Job Description

Must-have skills: Databricks, Python (strong), Azure Synapse

Job Description:

  • Minimum 9 years of experience in the IT industry
  • Must have hands-on experience in Azure cloud services such as ADF, ADLS, Azure SQL, Logic Apps, and Azure Functions
  • Strong hands-on experience in Databricks and related architecture like Medallion
  • Experience in implementing and delivering data solutions and pipelines on Databricks
  • A strong understanding of data modeling, data structures, data warehouses, databases, and ETL processes
  • An in-depth understanding of large-scale data sets, including structured and unstructured data
  • Experience in SQL, Python, and PySpark
  • Able to work under tight timelines and deliver on complex problems
  • Create documentation and knowledge bases for data pipelines, best practices, and solutions
  • A strong and positive team player
  • Good to have CI/CD and DevOps experience


Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Nagarro know you found this job on JobsCollider. Thanks! πŸ™