
Data Engineer

EUROPEAN DYNAMICS
Summary
Join EUROPEAN DYNAMICS (ED) as a Data Engineer in Valletta, Malta, working remotely or on-site at customer premises. You will be part of the Development team, collaborating with a major client's IT team. Your responsibilities include designing, developing, documenting, and maintaining ETL/ELT processes covering data integration, cleaning, transformation, dissemination, and automation, as well as data architecture, data modelling, and metadata. You will develop and support data warehouse/lakehouse architectures and data processing, ensuring data quality, lineage, auditing, metadata, logging, linkage across datasets, and impact assessments. You will also build and maintain business intelligence models, interactive dashboards, reports, and analytics using tools such as Databricks, Jupyter Notebooks, and Power BI, and contribute to the definition and documentation of data governance policies, procedures, standards, and metadata models.
Requirements
- University degree in IT or a relevant discipline, combined with a minimum of 6 years of relevant working experience in IT
- Experience with development and data processing using languages such as Python, SQL, Power Query M, and DAX
- Experience with structured, semi-structured, and unstructured data types and related file formats (e.g. JSON, Parquet, Delta)
- Experience with gathering business requirements and transforming them into data collection, integration, and analysis processes
- Experience with Microsoft on-premises and Azure data platform tools (such as Azure Data Factory, Azure Functions, Azure Logic Apps, SQL Server, ADLS, Azure Databricks, Microsoft Fabric/Power BI, Azure DevOps, Azure AI Services, PowerShell)
- Experience with the CI/CD lifecycle using Azure DevOps
- Experience with the Databricks ecosystem, Apache Spark, and Python data processing libraries (see the sketch after this list)
- Experience with Data Modelling principles and methods
- Experience with Data Lake and Data Lakehouse architectures, concepts, and governance
- Experience with Data Integration and data warehouse/lakehouse modelling techniques, concepts, and methods (e.g. SCD, Functional Engineering, Data Vault, Data Streaming)
- Experience with data governance and data management standards, policies, processes, metadata, and quality
- Experience with web APIs and the OpenAPI standard
- Knowledge of DAMA Data Management best practices and standards
- Knowledge of Data Governance and Discovery tools such as Azure Purview
- Knowledge of Master data and reference data management concepts
- Knowledge of Business glossaries, data dictionaries, and data catalogues
- Knowledge of Moodle or another Learning Management System
- Excellent command of the English language
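
To make the stack above concrete, the following is a minimal sketch of the kind of PySpark processing this role involves: ingesting semi-structured JSON, cleaning it, and persisting it as a Delta table. It assumes a Databricks cluster or a local Spark session with the delta-spark package configured; all paths and column names (events.json, event_id, event_ts) are illustrative, not taken from the posting.

```python
# Minimal PySpark sketch: ingest semi-structured JSON, clean it, and
# persist it as a Delta table. Assumes a Databricks cluster or a local
# Spark session with delta-spark configured; all paths and column
# names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-events").getOrCreate()

# Read semi-structured JSON (schema inferred for brevity).
raw = spark.read.json("/data/raw/events.json")

# Basic cleaning: drop duplicates, normalise a timestamp, and filter
# out records missing the business key.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_id").isNotNull())
)

# Persist as Delta, partitioned by ingestion date for downstream queries.
(
    clean.withColumn("ingest_date", F.current_date())
         .write.format("delta")
         .mode("append")
         .partitionBy("ingest_date")
         .save("/data/bronze/events")
)
```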
Responsibilities
- Design, develop, document, and maintain ETL/ELT pipelines covering data integration, cleaning, transformation, dissemination, and automation
- Design, develop, document, and maintain data architecture, data modelling, and metadata
- Develop and support data warehouse/lakehouse architectures and data processing, ensuring data quality, lineage, auditing, metadata, logging, linkage across datasets, and impact assessments (see the SCD sketch after this list)
- Develop and maintain business intelligence models, interactive dashboards, reports and analytics using tools such as Databricks, Jupyter Notebooks, and Power BI
- Design, develop, document, improve, and maintain the Data Warehouse/Lakehouse ecosystem (e.g. the DataDevOps lifecycle and architecture)
- Contribute to the definition and documentation of data governance policies, procedures, standards, and metadata models
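
As one hedged illustration of the warehouse/lakehouse modelling work named above, here is a sketch of a slowly changing dimension (SCD) Type 2 upsert using Delta Lake's MERGE API. It assumes delta-spark is available and that the change feed contains only new or changed rows; the table paths and columns (dim_customer, customer_id, is_current, valid_from/valid_to) are hypothetical.

```python
# Hedged sketch of an SCD Type 2 upsert with Delta Lake MERGE.
# Assumes a Spark session with delta-spark configured and a change
# feed containing only new or changed customer rows; all paths and
# column names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("scd2-upsert").getOrCreate()

dim = DeltaTable.forPath(spark, "/data/gold/dim_customer")
changes = spark.read.format("delta").load("/data/silver/customer_changes")

# Step 1: close out the current row for each changed business key.
(
    dim.alias("d")
       .merge(changes.alias("c"),
              "d.customer_id = c.customer_id AND d.is_current = true")
       .whenMatchedUpdate(set={
           "is_current": "false",
           "valid_to": "current_timestamp()",
       })
       .execute()
)

# Step 2: insert the new versions as the current rows.
(
    changes.withColumn("is_current", F.lit(True))
           .withColumn("valid_from", F.current_timestamp())
           .withColumn("valid_to", F.lit(None).cast("timestamp"))
           .write.format("delta")
           .mode("append")
           .save("/data/gold/dim_customer")
)
```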
Benefits
We offer competitive remuneration (either on a contract basis or with a full benefits package), based on qualifications and experience.