Consultant, Data Engineering

Hakkoda

πŸ“Remote - Costa Rica

Summary

Join Hakkoda, an IBM Company, as a Consultant Data Engineer and become a trusted consultant, designing and implementing scalable data solutions on Snowflake and other cloud platforms. You will work closely with clients, driving meaningful change in data-driven organizations. This pivotal role involves constructing data ingestion pipelines, establishing data architecture, and implementing data governance and security protocols. The ideal candidate is a skilled data pipeline builder and data wrangler, comfortable navigating diverse data needs across multiple teams and systems. Contribute to a startup environment and support customers in their next-generation data initiatives. Hakkoda offers a collaborative atmosphere that values learning, growth, and hard work.

Requirements

  • Bachelor’s degree in engineering, computer science, or an equivalent field
  • 3+ years in related technical roles, with experience in data management, database development, ETL, and/or data preparation
  • Experience developing data warehouses
  • Experience building ETL / ELT ingestion pipelines
  • Proficiency with cloud platform services for data engineering tasks, including managed database services (e.g., Snowflake and its trade-offs versus Redshift, BigQuery, etc.) and data processing services (AWS Glue, Azure Data Factory, Google Dataflow)
  • Skills in designing and implementing scalable and cost-effective solutions using cloud services, with an understanding of best practices for security and compliance
  • Knowledge of how to manipulate, process and extract value from large disconnected datasets
  • SQL and Python scripting experience required
  • Strong interpersonal skills, including assertiveness and the ability to build strong client relationships
  • Strong project management and organizational skills
  • Ability to support and work with cross-functional and agile teams in a dynamic environment
  • Advanced English required

Responsibilities

  • Collaborate with database architects, data analysts, and data scientists to ensure consistent and optimal data delivery architecture across ongoing customer projects
  • Construct data ingestion pipelines
  • Establish sound data architecture
  • Implement stringent data governance and security protocols
  • Design and develop Snowflake Data Cloud solutions
  • Optimize data systems from their foundational stages

Preferred Qualifications

  • Scala and JavaScript are a plus
  • Cloud experience (AWS, Azure or GCP) is a plus
  • Knowledge of any of the following tools is also a plus: Snowflake, Matillion, Fivetran, or dbt

Benefits

  • Comprehensive Life Insurance: including dental and vision, wellness, home spa treatments, express doctor visits, etc.
  • Paid Parental Leave
  • Flexible PTO Options
  • Company Bonus Program
  • AsociaciΓ³n Solidarista
  • Technical Training & Certifications
  • Extensive Learning and Development Opportunities
  • Flexible Work-from-Home Policy
  • Work from Anywhere Benefit
