Senior Crypto Backend Engineer

Token Metrics

πŸ“Remote - Greece

Summary

Join Token Metrics as a Senior Crypto Backend Engineer, collaborating with Data Scientists and Engineers. You will build frameworks that prepare data for analysis using SQL, Python, R, Java, and C++, applying machine learning techniques to create and maintain data analysis structures. Responsibilities include liaising with coworkers and clients, designing data infrastructure, optimizing existing frameworks, testing structures, building data pipelines, preparing raw data, implementing data validation, ensuring data backup, and staying current with industry standards. The role requires a bachelor's degree in a relevant field, 3+ years of experience with Python, Java, or a similar language, SQL and NoSQL experience, expertise in schema design and dimensional data modeling, and expert proficiency in several programming languages and cloud technologies.

Requirements

  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • 3+ years of development experience in Python, Java, or a similar programming language
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expertise in building data lakes, data warehouses, or equivalent systems
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • Ability to work well both independently and in a team
  • Capacity to manage a pipeline of tasks with minimal supervision

Responsibilities

  • Liaising with coworkers and clients to clarify the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building data pipelines that draw from diverse sources and formats, such as APIs, CSV files, and JSON
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs

Preferred Qualifications

A Master's degree in a relevant field is an added advantage
