Data Engineer


epay, a Euronet Worldwide Company

πŸ“Remote - United Kingdom

Summary

Join epay's global team of data experts and help develop and optimise data pipelines using Azure and Databricks. As a mid-level Data Engineer, you will work with diverse datasets across various industries, transforming complex data into actionable insights for internal and external stakeholders. You will build and maintain data pipelines, categorise and enrich datasets, automate processes using Python, and collaborate with data scientists and other teams. The role requires experience with Azure-based data services and proficiency in Python. Occasional global travel and flexibility across time zones are expected, and although the position is remote, it requires regular attendance at one of three UK locations. The ideal candidate is proactive, innovative, and a strong team player.

Requirements

  • 2+ years of professional experience in a data engineering or similar role
  • Proficiency in Python, including libraries for data processing such as pandas and PySpark (see the sketch after this list)
  • Experience working with Azure-based data services, particularly Azure Databricks, Data Factory, and Blob Storage
  • Demonstrable knowledge of data pipeline orchestration and optimisation
  • Understanding of SQL for data extraction and transformation
  • Familiarity with source control, deployment workflows, and working in Agile teams
  • Strong communication and documentation skills, including translating technical work to non-technical stakeholders
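
As an indication of the Python proficiency in scope, below is a minimal PySpark sketch of a standardise-and-aggregate transformation. The paths, column names, and business logic are illustrative assumptions, not taken from epay's actual pipelines.

```python
# A minimal sketch of a PySpark batch transformation; all dataset
# paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transactions-example").getOrCreate()

# Read raw transaction data (hypothetical location and schema).
raw = spark.read.parquet("/mnt/raw/transactions")

# Standardise and enrich: trim merchant names, derive a settlement date,
# then aggregate daily totals per merchant.
daily_totals = (
    raw.withColumn("merchant", F.trim(F.col("merchant")))
       .withColumn("settlement_date", F.to_date(F.col("settled_at")))
       .groupBy("merchant", "settlement_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("txn_count"))
)

# Write the curated output for downstream consumers.
daily_totals.write.mode("overwrite").parquet("/mnt/curated/daily_totals")
```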

Responsibilities

  • Data Pipeline Development: Build and maintain batch and streaming pipelines using Azure Data Factory and Azure Databricks
  • Data Categorisation & Enrichment: Structure unprocessed datasets through tagging, standardisation, and feature engineering
  • Automation & Scripting: Use Python to automate ingestion, transformation, and validation processes
  • ML Readiness: Work closely with data scientists to shape training datasets, applying sound feature selection techniques
  • Data Validation & Quality Assurance: Ensure accuracy and consistency across data pipelines with structured QA checks (see the sketch after this list)
  • Collaboration: Partner with analysts, product teams, and engineering stakeholders to deliver usable and trusted data products
  • Documentation & Stewardship: Document processes clearly and contribute to internal knowledge sharing and data governance
  • Platform Scaling: Monitor and tune infrastructure for cost-efficiency, performance, and reliability as data volumes grow
  • On-Call Support: Participate in an on-call rota to support the production environment, ensuring timely incident resolution and system stability outside standard working hours
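
To illustrate the structured QA checks mentioned above, here is a minimal pandas sketch of a batch validation step. The column names and rules are hypothetical assumptions, not epay's actual checks.

```python
# A minimal sketch of a structured QA check over pipeline output;
# column names and rules are hypothetical.
import pandas as pd

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable validation failures (empty = pass)."""
    failures = []
    if df.empty:
        failures.append("batch is empty")
    if df["transaction_id"].duplicated().any():
        failures.append("duplicate transaction_id values found")
    if df["amount"].isna().any():
        failures.append("null amounts present")
    if (df["amount"] < 0).any():
        failures.append("negative amounts present")
    return failures

# Example batch with deliberate problems to show the checks firing.
batch = pd.DataFrame({
    "transaction_id": [1, 2, 2],
    "amount": [10.0, None, -5.0],
})
for issue in validate_batch(batch):
    print(f"QA failure: {issue}")
```

Running checks like these before publishing a batch keeps bad records from propagating into downstream datasets.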

Preferred Qualifications

  • Exposure to machine learning workflows or model preparation tasks
  • Experience working in a financial, payments, or regulated data environment
  • Understanding of monitoring tools and logging best practices (e.g., Azure Monitor, Log Analytics); see the logging sketch after this list
  • Awareness of cost optimisation and scalable design patterns in the cloud
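
As a small illustration of logging best practice, the sketch below emits structured (JSON) log records using only the Python standard library. In an Azure setup, such records would typically be forwarded to Azure Monitor or Log Analytics by an agent or SDK, which is not shown here.

```python
# A minimal sketch of structured logging with the standard library only;
# the logger name and message contents are hypothetical.
import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        # Render each record as a single JSON object per line.
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("pipeline")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("ingested %d rows from %s", 1042, "blob://raw/transactions")
```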
