Remote Big Data Engineer

Nagarro

πŸ“Remote - Hungary

Job highlights

Summary

Join our dynamic, non-hierarchical work culture as a Data Engineer on the Business Analytics & Insights (BAI) team, collaborating with cross-functional teams to deliver high-quality data solutions.

Requirements

  • 4+ years of IT experience
  • Minimum of 4 years working with Azure Databricks
  • Proficiency in Data Modeling and Source System Analysis
  • Strong knowledge of PySpark and SQL
  • Experience with Azure components: Data Factory, Data Lake, SQL Data Warehouse (DW), and Azure SQL
  • Experience with Python for data engineering
  • Ability to conduct data profiling, cataloging, and mapping to support the technical design and construction of data flows
  • Familiarity with data visualization/exploration tools

Responsibilities

  • Lead the technical planning for data migration, including data ingestion, transformation, storage, and access control within Azure Data Factory and Azure Data Lake
  • Design and implement scalable, efficient data pipelines in Azure Databricks to ensure smooth data movement from multiple sources
  • Develop reusable frameworks for the ingestion of large datasets
  • Ensure data quality and integrity by implementing robust validation and cleansing mechanisms throughout the data pipeline
  • Work with event-based/streaming technologies to ingest and process data in real time
  • Provide technical support to the team, resolving challenges during the migration and post-migration phases
  • Stay current with the latest advancements in cloud computing, data engineering, and analytics technologies; recommend best practices and industry standards for data lake solutions

This job is filled or no longer available