Datacom is hiring a
Data Senior Consultant

💵 ~$120k-$170k
📍 Remote - New Zealand

Summary

Datacom is hiring a Snowflake Data Engineer for its growing analytics team. The role involves designing data platforms, creating software solutions for data ingest and integration, developing and operationalising reliable data pipelines, building analytics tools, and developing infrastructure. The position is remote and can be based anywhere, and the successful candidate should have extensive knowledge of Snowflake's architecture and features, strong programming skills, and experience with cloud platforms, databases, distributed processing technologies, and CI/CD pipelines.

Requirements

  • Extensive knowledge of Snowflake’s architecture and features
  • Experience in the design, development and implementation of data solutions using Snowflake
  • Experience with dbt and/or Coalesce
  • Strong programming skills, especially in Python, Java, Scala, C++, or C#
  • Experience with cloud platforms such as Azure, AWS, and GCP
  • Experience with relational database management systems such as SQL Server and Redshift
  • Experience with distributed processing technologies such as Apache Spark
  • Working knowledge of message queuing, stream processing, and highly scalable data stores
  • Experience building CI/CD pipelines with tools such as GitHub Actions and Azure DevOps
  • Degree in Computer Science, Data Science, Statistics or related field
  • Ability to connect customers' specific business problems with Snowflake-based solutions
  • At least three years' experience in Snowflake development

Responsibilities

  • Designing data platforms, distributed systems, data lakes, and data stores
  • Creating software solutions for data ingest and integration
  • Developing and operationalising reliable data pipelines and ETL patterns
  • Building analytics tools to provide actionable insights and solve business problems
  • Developing infrastructure
  • Wrangling and integrating data from multiple sources
  • Identifying ways to improve data reliability, efficiency and quality
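To illustrate the kind of "reliable data pipelines and ETL patterns" the role calls for, here is a minimal extract-transform-load sketch in plain Python. All names and data here are illustrative assumptions, not Datacom's actual stack: a real pipeline would pull from a source system and write to a warehouse such as Snowflake (typically via dbt models or Snowflake connectors) rather than to in-memory lists.

```python
from typing import Iterable


def extract() -> Iterable[dict]:
    """Stand-in for pulling raw records from a source system."""
    return [
        {"id": "1", "amount": "10.50", "region": "NZ"},
        {"id": "2", "amount": "bad", "region": "NZ"},  # malformed row
        {"id": "3", "amount": "7.25", "region": "AU"},
    ]


def transform(rows: Iterable[dict]) -> list[dict]:
    """Clean and type the raw rows, dropping records that fail validation."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "id": int(row["id"]),
                "amount": float(row["amount"]),
                "region": row["region"],
            })
        except (KeyError, ValueError):
            # a production pipeline would log or quarantine bad records
            continue
    return cleaned


def load(rows: list[dict], target: list) -> None:
    """Stand-in for writing to a warehouse table."""
    target.extend(rows)


warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(len(warehouse))  # prints 2: the malformed row was dropped
```

The validation step in `transform` is the part that speaks to the "data reliability, efficiency and quality" responsibility: bad records are rejected at the boundary instead of propagating downstream.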

Preferred Qualifications

  • Machine Learning frameworks and theory
  • Analytics platforms and tools such as Databricks, Alteryx, SAS, KNIME, or DataRobot
  • Data Vault / Kimball modelling methodologies
  • DevOps / DataOps

Benefits

  • Social events
  • Chill-out spaces
  • Remote working
  • Flexi-hours
  • Professional development courses

This job is filled or no longer available.