Data Platform Engineer

BMLL

πŸ“Remote - United Kingdom

Summary

Join BMLL's Core Engineering team as a Data Platform Engineer and architect our core platform for executing complex data pipelines. You will design and build scalable solutions for millions of concurrent job executions, ensuring high availability and cost-effectiveness. Collaborate with development and operations teams to solve complex problems involving large data pipelines. We offer a combination of remote and London-based office working, along with share options, a discretionary bonus, and other benefits. The role requires strong Python skills, experience with AWS, and a computer science degree. Familiarity with distributed systems and DevOps practices is desirable.

Requirements

  • Industry experience with cloud computing tools and services in complex systems, preferably in AWS
  • Strong Python programming skills
  • Industry experience with software development lifecycle processes and tools
  • Experience working in a Linux environment
  • Experience with Docker
  • Experience with SQL and relational databases
  • An avid learner and problem solver with strong attention to detail
  • Excellent teamwork and communication skills, with the ability to work collaboratively in multidisciplinary teams
  • Computer science or other STEM degree
  • At least two years of industry experience

Responsibilities

  • Design and build solutions to scale AWS compute resources to meet application performance requirements
  • Ensure 24/7 system reliability by implementing company and industry best practices in replication, redundancy and monitoring
  • Implement workflow management software to automate operational tasks and optimise the utilisation of infrastructure and applications
  • Design and implement CI/CD workflows to maintain software quality via continuous and automated deployment and testing
  • Work with development and operations teams to design efficient, cost-effective solutions to complex problems involving large data pipelines that process terabytes of historical market data
  • Regularly review new tools that become available in the industry and assess how they could be integrated into the platform for continuous improvement

Preferred Qualifications

  • Familiarity with distributed systems concepts and tools, such as Spark, Ray, RabbitMQ, Kafka, AWS Batch
  • Familiarity with DevOps practices and tools, such as Terraform
  • Familiarity with job execution and orchestration tools, such as Celery and Airflow

Benefits

  • Competitive salary
  • 25 days holiday plus bank holidays
  • Share Options after completion of probationary period
  • Discretionary Bonus
  • Pension Scheme
  • Private Medical Insurance
  • Work remotely abroad for up to 40 business days each year
  • Life Insurance
  • Combination of remote and London-based office working (2-3 days in office per week)
  • A yearly Wellbeing Physical Activity budget
  • Continuous learning through funded training and challenging projects
  • Collaborative culture
  • Weekly team lunches
  • Free fruit, snacks, and drinks throughout the day (when office-based)
  • Regular Team Socials
  • Cycle to Work Scheme
