Data Engineer

Kpler

πŸ“Remote - Colombia

Summary

Join Kpler's Commodities Market Data team and contribute to the development and implementation of core data ingestion and distribution pipelines. Work alongside engineers to build back-end systems, troubleshoot workflows, and optimize performance. Ensure data integrity and accuracy across pipelines. The role requires proficiency in Python, data manipulation, data integration, and familiarity with containerization and orchestration tools. Experience with AWS and Terraform is a plus. Kpler offers a dynamic work environment focused on innovation and collaboration.

Requirements

  • Proficient in Python, with experience in data manipulation and transformation (such as with Pandas)
  • Knowledge of data integration, ETL processes, and batch/streaming data processing
  • An understanding of containerisation and orchestration tools (e.g., Docker, Kubernetes, Airflow)
  • Familiar with SQL, RDBMS, or Big Data technologies
  • Comfortable working with Git, code reviews, and Agile methodologies

Responsibilities

  • Working alongside other engineers, take responsibility for developing and implementing our core data ingestion and distribution pipelines and associated back-end systems, based on project requirements and design specifications
  • Help troubleshoot and optimize existing workflows to improve performance and efficiency
  • Ensure data integrity and accuracy across various pipelines
  • Demonstrate strong analytical and debugging skills with a proactive approach to learning

Preferred Qualifications

Experience working with AWS (or another cloud provider), using Terraform
