Senior Data Engineer

accesa.eu

📍Remote - Romania

Job highlights

Summary

Join Accesa, a leading technology company, and contribute to the optimization of data management strategies for a prominent financial client. This role involves migrating data warehouse models into data products within the Data Integration Hub (DIH). You will be responsible for creating and maintaining data transformation pipelines, working with large financial datasets, and leading process improvements. The position requires extensive experience in big data technologies, ETL processes, and CI/CD pipelines. Accesa offers a comprehensive benefits program focusing on physical, emotional, social, and work-life fusion well-being.

Requirements

  • Have 5+ years of experience in a similar role, preferably within Agile teams
  • Be skilled in SQL and relational databases for data manipulation
  • Have experience in building and optimizing Big Data pipelines and architectures
  • Have familiarity with innovative technologies in message queuing, stream processing, and scalable big data storage solutions
  • Have knowledge of Apache Spark framework and object-oriented programming in Java; experience with Python is a plus
  • Have proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement
  • Have experience with ETL processes, including scheduling and orchestration using tools like Apache Airflow (or similar)
  • Have experience automating CI/CD pipelines using ArgoCD, Tekton, and Helm to streamline deployment and improve efficiency across the SDLC
  • Have experience managing Kubernetes deployments (e.g., OpenShift), with a focus on scalability, security, and optimized container orchestration
  • Have strong analytical skills in working with both structured and unstructured data

Responsibilities

  • Create and maintain optimal data transformation pipelines
  • Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements
  • Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies
  • Build/use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with clients and internal stakeholders, including Senior Management, Department Heads, and Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs

Preferred Qualifications

  • Have expertise in processing large, disconnected datasets to extract actionable insights
  • Have technical skills in the following areas: relational databases (e.g., PostgreSQL), Big Data tools (e.g., Databricks), workflow management (e.g., Airflow), and backend development using Spring Boot

Benefits

  • Premium medical package for both our colleagues and their children
  • Dental coverage up to a yearly amount
  • Eyeglasses reimbursement every two years
  • Voucher for sport equipment expenses
  • In-house personal trainer
  • Individual therapy sessions with a certified psychotherapist
  • Webinars on self-development topics
  • Virtual activities
  • Sports challenges
  • Special occasions get-togethers
  • Yearly increase in days off
  • Flexible working schedule
  • Birthday, holiday and loyalty gifts for major milestones
