Python Data Engineer

DRW

πŸ“Remote - United Kingdom

Summary

Join DRW's Weather team as a skilled Python Data Engineer. You will design, develop, and maintain efficient, scalable data pipelines in Python; extract, transform, and load (ETL) data from a variety of sources; and collaborate with data scientists and analysts to deliver high-quality data solutions. You will monitor and optimize pipeline performance, implement data validation and error handling to ensure data quality and reliability, work with cloud-based data storage and processing solutions, and stay current with industry trends and technologies.

Requirements

  • Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
  • Minimum of 3 years of experience in data engineering or a related role
  • Proficiency in Python programming and experience with libraries such as Pandas, NumPy, and FastAPI
  • Experience with weather and climate datasets and tooling (e.g., Copernicus, Xarray, Zarr, NetCDF)
  • Experience with ETL tools and frameworks (e.g., Apache Airflow, Apache NiFi, Talend)
  • Strong understanding of relational databases and SQL
  • Experience with cloud platforms (e.g., AWS, GCP, Azure) and their data services
  • Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake)
  • Experience with version control systems (e.g., Git)
  • Strong problem-solving skills and attention to detail
  • Excellent communication and collaboration skills

Responsibilities

  • Design, develop, and maintain efficient and scalable data pipelines using Python
  • Extract, transform, and load (ETL) data from various sources into our data platform
  • Collaborate with data scientists and analysts to understand data requirements and deliver high-quality data solutions
  • Monitor and optimize the performance of data pipelines to ensure data quality and reliability
  • Implement data validation and error-handling mechanisms to ensure data accuracy
  • Work with cloud-based data storage and processing solutions (e.g., AWS, GCP, Azure)
  • Stay up-to-date with industry trends and emerging technologies to continuously improve our data engineering capabilities
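In miniature, the ETL and validation responsibilities above look something like the sketch below. The station records, column names, and validation thresholds are illustrative assumptions, not DRW specifics; a real pipeline would read from an external source and write to a warehouse rather than work in memory:

```python
import pandas as pd

def extract(records: list[dict]) -> pd.DataFrame:
    """Extract: load raw records (an in-memory stand-in for a real source)."""
    return pd.DataFrame(records)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: validate and clean, dropping rows that fail basic checks."""
    df = df.dropna(subset=["station_id", "temp_c"])  # reject incomplete rows
    df = df[df["temp_c"].between(-90, 60)]           # plausible temperature range
    df["temp_c"] = df["temp_c"].astype(float)
    return df

def load(df: pd.DataFrame) -> int:
    """Load: here just report the row count; a real pipeline would write out."""
    return len(df)

raw = [
    {"station_id": "EGLL", "temp_c": 14.2},
    {"station_id": "EGLL", "temp_c": 999.0},  # sensor error, fails validation
    {"station_id": None, "temp_c": 10.0},     # missing key, fails validation
]
loaded = load(transform(extract(raw)))  # only the valid row survives
```

In production this shape is typically wrapped in an orchestrator such as Apache Airflow, with each stage as a task and the validation failures logged rather than silently dropped.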

Preferred Qualifications

  • Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
  • Knowledge of big data technologies (e.g., Hadoop, Spark)
  • Experience in commodities (Agriculture, Natural Gas, Power)
