Data Engineer (with Spark, Airflow)
accesa.eu
Remote - Romania
Please let accesa.eu know you found this job on JobsCollider. Thanks!
Job highlights
Summary
Join our team as a Data Engineer to build and maintain scalable data infrastructure, work with large and complex financial datasets, drive process optimization and automation, and collaborate with cross-functional teams to turn data into actionable insights.
Requirements
- Must have 3+ years of experience in a similar role, preferably within Agile teams
- Strong analytical skills in working with both structured and unstructured data
- Skilled in SQL and relational databases for data manipulation
- Experience in building and optimizing Big Data pipelines and architectures
- Knowledge of the Apache Spark framework and object-oriented programming in Java; experience with Python is a plus
- Experience with ETL processes, including scheduling and orchestration using tools such as Apache Airflow (see the sketch after this list)
- Proven experience in performing data analysis and root cause analysis on diverse datasets to identify opportunities for improvement
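For illustration, here is a minimal sketch of the kind of Spark-plus-Airflow orchestration these requirements describe; the DAG id, connection id, JAR path, and main class are hypothetical placeholders, not details from the posting.

```python
# Hypothetical Airflow DAG that schedules a daily Spark batch job.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_transactions_etl",      # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a Java Spark application (Java being the posting's primary
    # language); Airflow handles scheduling, retries, and backfills.
    transform = SparkSubmitOperator(
        task_id="transform_transactions",
        application="/opt/jobs/transactions-etl.jar",   # assumed JAR path
        java_class="com.example.etl.TransactionsJob",   # assumed main class
        conn_id="spark_default",
        application_args=["--run-date", "{{ ds }}"],
    )
```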
Responsibilities
- Drive Data Efficiency: Create and maintain optimal data transformation pipelines (see the PySpark sketch after this list)
- Master Complex Data Handling: Work with large, complex financial data sets to generate outputs that meet functional and non-functional business requirements
- Lead Innovation and Process Optimization: Identify, design, and implement process improvements such as automating manual processes, optimizing data delivery, and re-designing infrastructure for higher scalability
- Architect Scalable Data Infrastructure: Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source technologies
- Unlock Actionable Insights: Build and use analytics tools on top of the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Collaborate with Cross-Functional Teams: Work with clients and internal stakeholders, including Senior Management, Department Heads, and Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
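As a concrete, hypothetical illustration of the transformation-pipeline responsibility above, here is a minimal PySpark job that reads raw records, derives a daily aggregate, and writes a partitioned output; all paths and column names are assumptions.

```python
# Hypothetical PySpark transformation: raw transactions -> daily aggregates.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transactions-daily-agg").getOrCreate()

raw = spark.read.parquet("s3a://example-bucket/raw/transactions/")  # assumed source

daily = (
    raw.filter(F.col("amount").isNotNull())        # drop incomplete records
       .withColumn("day", F.to_date("event_ts"))   # bucket by calendar day
       .groupBy("day", "account_id")
       .agg(
           F.sum("amount").alias("total_amount"),
           F.count("*").alias("txn_count"),
       )
)

# Partitioning by day keeps downstream reads (e.g. BI dashboards) cheap.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3a://example-bucket/curated/transactions_daily/"
)
```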
Preferred Qualifications
- Expertise in manipulating and processing large, disconnected datasets to extract actionable insights
- Experience automating CI/CD pipelines with ArgoCD, Tekton, and Helm to streamline deployments and improve efficiency across the SDLC
- Experience managing Kubernetes deployments on OpenShift, with a focus on scalability, security, and optimized container orchestration (a minimal sketch follows this list)
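A minimal sketch of what programmatic deployment management can look like, using the official Kubernetes Python client (the same API works against OpenShift clusters); the deployment and namespace names are hypothetical.

```python
# Hypothetical example: scale a deployment with the Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()   # or load_incluster_config() when running in a pod
apps = client.AppsV1Api()

# Scale an assumed Spark history-server deployment to three replicas.
apps.patch_namespaced_deployment_scale(
    name="spark-history-server",
    namespace="data-platform",
    body={"spec": {"replicas": 3}},
)
```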
Benefits
- Physical: premium medical package for both our colleagues and their children, dental coverage up to a yearly amount, eyeglasses reimbursement every two years, voucher for sports equipment expenses, in-house personal trainer
- Emotional: individual therapy sessions with a certified psychotherapist, webinars on self-development topics
- Social: virtual activities, sports challenges, special occasions get-togethers
- Work-life fusion: yearly increase in days off; flexible working schedule; birthday, holiday, and loyalty gifts for major milestones