Data Engineer

Adaptiq
Summary
Join Adaptiq, a technology hub supporting R&D teams, as a Senior Data Platform Engineer to contribute to Finaloop's data infrastructure. You will design, build, and maintain scalable data pipelines and ETL processes for a real-time financial accounting platform. This role involves developing and optimizing data infrastructure for analytics and reporting, implementing data governance and security, and collaborating with data scientists. You will troubleshoot data infrastructure issues, monitor system performance, and stay updated on emerging technologies. The position requires strong Python and SQL skills, experience with various platforms and tools, and a deep understanding of data engineering principles. Adaptiq offers a fully remote work model, competitive compensation, and 20 days of vacation leave.
Requirements
- 5+ years of experience in Data Engineering or Platform Engineering roles
- Strong programming skills in Python and SQL
- Experience with orchestration platforms and tools (Airflow, Dagster, Temporal, or similar)
- Experience with MPP platforms (e.g., Snowflake, Redshift, Databricks)
- Hands-on experience with cloud platforms (AWS) and their data services
- Understanding of data modeling, data warehousing, and data lake concepts
- Ability to optimize data infrastructure for performance and reliability
- Experience working with containerization (Docker) in Kubernetes environments
- Familiarity with CI/CD concepts and principles
- Fluent English (written and spoken)
Responsibilities
- Designing, building, and maintaining scalable data pipelines and ETL processes for our financial data platform
- Developing and optimizing data infrastructure to support real-time analytics and reporting
- Implementing data governance, security, and privacy controls to ensure data quality and compliance
- Creating and maintaining documentation for data platforms and processes
- Collaborating with data scientists and analysts to deliver actionable insights to our customers
- Troubleshooting and resolving data infrastructure issues efficiently
- Monitoring system performance and implementing optimizations
- Staying current with emerging technologies and implementing innovative solutions
Preferred Qualifications
- Experience with big data processing frameworks (Apache Spark, Hadoop)
- Experience with stream processing technologies (Flink, Kafka, Kinesis)
- Knowledge of infrastructure as code (Terraform)
- Experience building analytics platforms or clickstream pipelines
- Familiarity with ML workflows and MLOps
- Experience working in a startup environment or fintech industry
Benefits
- 20 days of vacation leave per calendar year (plus the official national holidays of the country you are based in)
- Full accounting and legal support in all countries where we operate
- Fully remote work model, with a powerful workstation and co-working space available if you need them
- A highly competitive package with yearly performance and compensation reviews