Big Data Engineer


Nagarro

πŸ“Remote - Romania

Summary

Join Nagarro, a global digital product engineering company, and become a key member of our team. You will collaborate with cross-functional teams to deliver high-quality solutions in various domains. Key responsibilities include designing and implementing data pipelines using Azure Databricks, ensuring data quality, working with streaming technologies, and providing technical support. You will need 2+ years of data engineering experience, hands-on Azure Databricks experience, and proficiency in PySpark and SQL. Excellent communication and collaboration skills are essential. This role offers the opportunity to work on challenging projects and stay up-to-date with the latest technologies.

Requirements

  • 2+ years of experience in the data engineering field
  • Hands-on working experience with Azure Databricks
  • Experience in Data Modelling & Source System Analysis
  • Familiarity with PySpark
  • Mastery of SQL
  • Knowledge of components: Azure Data Factory, Azure Data Lake, Azure SQL DW, Azure SQL
  • Experience using Python for data engineering purposes
  • Ability to conduct data profiling, cataloging, and mapping for technical design and construction of technical data flows
  • Experience with data visualization/exploration tools
  • Excellent communication skills, with the ability to effectively convey complex ideas to technical and non-technical stakeholders
  • Strong team player with excellent interpersonal and collaboration skills

Responsibilities

  • Contribute to the technical plan for the migration, including data ingestion, transformation, storage, and access control in Azure Data Factory and Azure Data Lake
  • Design and implement scalable and efficient data pipelines to ensure smooth data movement from multiple sources using Azure Databricks
  • Develop scalable and reusable frameworks for ingesting data sets
  • Ensure data quality and integrity throughout the entire data pipeline, implementing robust data validation and cleansing mechanisms
  • Work with event-based/streaming technologies to ingest and process data
  • Provide support to the team, resolving any technical challenges or issues that may arise during the migration and post-migration phases
  • Stay up to date with the latest advancements in cloud computing, data engineering, and analytics technologies, and recommend best practices and industry standards for implementing the data lake solution
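To make the data-quality responsibility above concrete, here is a minimal, library-free sketch of a row-level validation and quarantine step. In a real pipeline this logic would typically run as PySpark transformations on Azure Databricks; the field names, rules, and helper functions below are purely illustrative assumptions, not part of the role description.

```python
# Illustrative sketch only: validate incoming records and quarantine
# rows that fail basic data-quality rules, as a data pipeline might
# before loading to a data lake. All names here are hypothetical.

def validate_row(row, required_fields):
    """Return a list of rule violations for one record (empty = valid)."""
    errors = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            errors.append(f"missing {field}")
    # Example type rule: 'amount', if present, must be numeric.
    if "amount" in row and not isinstance(row["amount"], (int, float)):
        errors.append("amount is not numeric")
    return errors

def split_valid_invalid(rows, required_fields):
    """Route clean rows onward; quarantine the rest with their errors."""
    valid, quarantined = [], []
    for row in rows:
        errors = validate_row(row, required_fields)
        if errors:
            quarantined.append({"row": row, "errors": errors})
        else:
            valid.append(row)
    return valid, quarantined

rows = [
    {"id": 1, "amount": 9.5},
    {"id": 2, "amount": "bad"},
    {"id": None, "amount": 3.0},
]
valid, quarantined = split_valid_invalid(rows, required_fields=["id"])
```

On Databricks the same split is often expressed as two filtered DataFrames, with the quarantined set written to a separate path for inspection.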


