Senior Data Engineer

Auros

πŸ“Remote - United Kingdom

Summary

Join Auros, a leading algorithmic trading and market-making firm, as a Data Engineer to champion the firm's market and trading data archives and internal data products. You will work with existing data pipelines and databases while designing and implementing the next generation of Auros data and analytic capabilities. This role offers the opportunity to make a substantial impact on business outcomes by developing, testing, and maintaining high-throughput, high-volume distributed data architectures. You will collaborate with traders and trading system developers to improve data quality and develop tools for easy data access. The position requires experience with Python, tick databases, Amazon S3, and large-scale data pipelines. You will learn from experienced traders and contribute to systems executing millions of crypto trades globally.

Requirements

  • Experience with Python, tick databases (e.g. ClickHouse and/or Vertica) and Amazon S3
  • Experience developing real-time, large-scale data collection pipelines (handling petabytes of data)
  • Experience with compute cluster management on AWS (Ray, Dask, etc.)
  • Experience building research pipelines on such large datasets
  • Extensive experience conducting data analysis and building ad hoc tooling to analyse time-series and other large datasets
  • A bachelor's degree (or above) in Computer Science, Software Engineering or similar, with excellent results

Responsibilities

  • Develop, test and maintain high-throughput, high-volume distributed data architectures
  • Analyse, define and automate data quality improvements
  • Build and improve trading analytics systems
  • Create tools to automate the configuration, deployment, building and troubleshooting of the data pipelines
  • Develop strategies to make our data pipeline efficient, timely and robust in a 24/7 trading environment
  • Implement monitoring that measures the completeness and accuracy of captured data
  • Manage the impact that changes to trading systems and upstream protocols have on the data pipeline
  • Collaborate with traders and trading system developers to understand our data analysis requirements, and to continue to improve the quality of our stored data
  • Develop tools, APIs and screens to provide easy access to the archived data

Preferred Qualifications

  • Experience with data lakes, Amazon S3 or similar
  • Experience developing in C++ on Linux
  • Protocol-level network analysis experience
  • Experience with Terraform
  • Experience with ClickHouse
  • Experience with technologies such as Hive, Hadoop, Snowflake, Presto or similar
