Remote Senior Crypto Backend Engineer

Token Metrics

πŸ“Remote - Worldwide

Job highlights

Summary

Token Metrics is seeking a Back End Engineer with at least three years of experience in Python, Java, SQL, and NoSQL to support the operations of its Data Scientists and Engineering team. The engineer will construct frameworks using a variety of tools and techniques, apply machine learning methods, collaborate with coworkers, optimize and test existing structures, build data pipelines from different sources, prepare raw data for use by Data Scientists, implement proper data validation and reconciliation methodologies, and keep work backed up and accessible. The company is looking for someone with a Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or a related field, excellent analytical skills, and the ability to work independently while managing a pipeline of duties with minimal supervision.

Requirements

  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • 3+ years of development experience in Python, Java, or another programming language
  • 3+ years of SQL and NoSQL experience (Snowflake Cloud Data Warehouse and MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, and R
  • Expertise in building a data lake, data warehouse, or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • Ability to work well both independently and in a team
  • Ability to manage a pipeline of duties with minimal supervision

Responsibilities

  • Liaising with coworkers and clients to clarify the requirements for each task
  • Conceptualizing and building infrastructure that allows big data to be accessed and analyzed
  • Reworking existing frameworks to optimize their performance
  • Testing these structures to ensure they are fit for use
  • Building data pipelines from different sources and data formats, such as APIs, CSV, and JSON
  • Preparing raw data for use by Data Scientists
  • Implementing proper data validation and reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Staying up to date with industry standards and technological advancements that improve the quality of your outputs

Preferred Qualifications

A Master's degree in a relevant field is an added advantage

Benefits

Not specified

Job description

Token Metrics is seeking a multi-talented Back End Engineer to support the operations of our Data Scientists and Engineering team. The Back End Engineer will employ various tools and techniques to construct frameworks that prepare information using SQL, Python, R, Java, and C++, and will apply machine learning techniques to create and maintain structures that allow data to be analyzed, while remaining familiar with the dominant programming and deployment strategies in the field. Throughout this process, you will collaborate with coworkers to ensure that your approach meets the needs of each project.

About Token Metrics

Token Metrics helps crypto investors build profitable portfolios using artificial intelligence-based crypto indices, rankings, and price predictions.

Token Metrics has a diverse set of customers, from retail investors and traders to crypto fund managers, in more than 50 countries.

Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Token Metrics know you found this job on JobsCollider. Thanks! 🙏