Senior Data Engineer-Tech Lead
Zealogics Inc
Remote - Worldwide
Job highlights
Summary
Join our team as a Data Engineer with 9+ years of experience, including 5+ years in Azure technologies. You will participate in business discussions, gather data requirements, and lead a team to address data challenges. Proficiency in SQL, PySpark, Azure Data Factory, and Databricks is essential. You will design and implement data pipelines, optimize data workflows, and ensure data governance. Strong communication and collaboration skills are required, along with a willingness to learn and adapt. Experience with CI/CD tools like Jenkins and Azure DevOps is a plus.
Requirements
- 9+ years of overall experience with more than 5 years of expertise in Azure technologies
- Proficiency in writing complex SQL queries for data extraction, transformation, and analysis
- Knowledge of SQL functions, joins, subqueries, and performance tuning
- Hands-on experience with PySpark, Spark SQL, and related tools (a brief illustrative sketch follows this list)
- Hands-on experience in creating and managing data pipelines using Azure Data Factory
- Knowledge of data engineering workflows and best practices in Databricks
- Proficiency in using Git for version control and collaboration in data projects
- Clear and effective communication skills to articulate findings and recommendations to other team members
- Ability to document processes, workflows, and data analysis results effectively
- Willingness to learn new tools, technologies, and techniques as the field of data analytics evolves
- Adaptability to changing project requirements and priorities
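For illustration only, here is a minimal PySpark/SQL sketch of the kind of query work described above. The session setup and the sales.orders, sales.customers, and sales.excluded_statuses tables and their columns are hypothetical placeholders, not specifics of this role.

```python
# Illustrative sketch; table names and columns are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("order-analysis").getOrCreate()

# Complex SQL via PySpark: a CTE, a join, a subquery filter, and a window
# function to rank customers by monthly spend.
monthly_spend = spark.sql("""
    WITH order_totals AS (
        SELECT o.customer_id,
               date_trunc('month', o.order_date) AS order_month,
               SUM(o.amount)                     AS total_amount
        FROM   sales.orders o
        JOIN   sales.customers c ON c.customer_id = o.customer_id
        WHERE  o.status NOT IN (SELECT status FROM sales.excluded_statuses)
        GROUP  BY o.customer_id, date_trunc('month', o.order_date)
    )
    SELECT customer_id,
           order_month,
           total_amount,
           RANK() OVER (PARTITION BY order_month ORDER BY total_amount DESC) AS spend_rank
    FROM   order_totals
""")

monthly_spend.show(10, truncate=False)
```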
Responsibilities
- Participate in business discussions and assist in gathering data requirements
- Lead a team to address data challenges
- Write complex SQL queries for data extraction, transformation, and analysis
- Navigate source systems to understand how data is related and use data profiling to gain a better understanding of the data
- Create and manage data pipelines using Azure Data Factory
- Understand data integration, transformation, and workflow orchestration in Azure environments
- Understand data engineering workflows and best practices in Databricks
- Understand existing templates and patterns for development
- Use Unity Catalog and Databricks Workflows
- Use Git for version control and collaboration in data projects
- Work effectively in a team environment, especially in agile or collaborative settings
- Articulate findings and recommendations for other team members
- Document processes, workflows, and data analysis results effectively
- Learn new tools, technologies, and techniques as the field of data analytics evolves
- Adapt to changing project requirements and priorities
- Envision and lead the end-to-end (E2E) solution and resolve technical issues arising during offshore delivery
- Design and implement data pipelines using Databricks and Spark (see the sketch after this list)
- Optimize data workflows and predictive modeling
- Develop expertise in batch and streaming data solutions, automating workflows with CI/CD tools like Jenkins and Azure DevOps, and ensuring data governance with Delta Lake
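As a rough sketch of the batch and streaming Delta Lake pipeline work described above (assuming a Databricks or delta-spark environment; the paths, schema, and checkpoint location are hypothetical, not specifics of this role):

```python
# Illustrative sketch; paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-pipeline").getOrCreate()

RAW_PATH = "/mnt/raw/events"        # hypothetical landing zone
BRONZE_PATH = "/mnt/bronze/events"  # hypothetical Delta table location

# Batch: read raw JSON, apply light cleansing, append to a Delta table.
(spark.read.json(RAW_PATH)
      .withColumn("ingest_date", F.current_date())
      .filter(F.col("event_id").isNotNull())
      .write.format("delta")
      .mode("append")
      .save(BRONZE_PATH))

# Streaming: incrementally aggregate the same Delta table as new data arrives.
stream = (spark.readStream.format("delta").load(BRONZE_PATH)
               .groupBy("event_type")
               .count())

query = (stream.writeStream
               .format("delta")
               .outputMode("complete")
               .option("checkpointLocation", "/mnt/checkpoints/event_counts")
               .start("/mnt/gold/event_counts"))
```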
Preferred Qualifications
- Expertise in optimizing data workflows and predictive modeling
- Experience with Azure Databricks, Data Lakehouse architectures, and Azure Data Factory
- Proficiency in Spark, PySpark, Delta Lake, Azure DevOps, and Python (see the Delta Lake sketch below)
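A small sketch of a Delta Lake upsert into a Lakehouse table, assuming a Databricks or delta-spark environment; the lakehouse.dim_customer table and the source path are hypothetical placeholders:

```python
# Illustrative sketch; the target table and source path are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customer-upsert").getOrCreate()

# Hypothetical incremental updates arriving from an upstream source.
updates = spark.read.format("delta").load("/mnt/silver/customer_updates")

# Upsert (MERGE) into a Delta table: update matching rows, insert new ones.
target = DeltaTable.forName(spark, "lakehouse.dim_customer")
(target.alias("t")
       .merge(updates.alias("u"), "t.customer_id = u.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```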
Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Zealogics Inc know you found this job on JobsCollider. Thanks!