Data Platform Engineer

Vector Limited

πŸ“Remote - New Zealand

Summary

Join Vector, a company modernizing its electricity distribution network and embedding smart technologies, as a Data Platform Engineer. You will design, implement, and support cloud-based data solutions, working within an Agile DevOps team. Responsibilities include building and maintaining data platforms, integrating machine learning models, and providing DevOps support. The ideal candidate has at least 3 years of experience in platform or data engineering, proficiency in data engineering concepts, and hands-on experience with AWS data lake technologies and related ingestion and transformation tooling. Vector offers a collaborative environment, professional development opportunities, flexible work options, comprehensive health and wellbeing support, and great family-focused benefits.

Requirements

  • At least 3 years of professional experience as a Platform Engineer or Data Engineer, with a strong foundation in designing and implementing scalable data solutions
  • Proficiency in data engineering concepts, tools, and best practices, including data modeling, ETL processes, and pipeline architecture
  • Hands-on experience with:
      • AWS data lake technologies (IAM, S3, EC2)
      • Data ingestion tools (Fivetran, AWS Glue)
      • Data transformation tools (Coalesce, dbt)
      • Snowflake cloud data platform
      • CI/CD pipeline development
  • Working knowledge of behaviour-driven development (BDD) and test-driven development (TDD), enabling reliable, testable data solutions (see the sketch after this list)
  • Experience with Agile methodologies and a DevOps culture, fostering collaboration and continuous improvement
  • Strong interpersonal skills, with a proven ability to build and maintain effective relationships with diverse stakeholders through clear and empathetic communication
  • Adaptability in dynamic matrix environments, effectively collaborating across cross-functional teams and adjusting to evolving industry trends, priorities, tools, processes and workflows
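
To give a flavour of the TDD style mentioned above, here is a minimal, hypothetical Python sketch (not Vector's codebase): a small pandas-based transform for meter readings together with a pytest-style unit test. The function name, column names, and the use of pandas and pytest are assumptions for illustration only.

```python
# Hypothetical sketch only: a pure transform plus a unit test, illustrating
# the TDD approach called for above. Assumes pandas and pytest are installed.
import pandas as pd


def normalise_readings(df: pd.DataFrame) -> pd.DataFrame:
    """Standardise column names, then drop duplicate meter readings."""
    out = df.copy()
    out.columns = [c.strip().lower() for c in out.columns]
    return out.drop_duplicates(subset=["meter_id", "read_at"])


def test_normalise_readings_drops_duplicates():
    raw = pd.DataFrame(
        {
            "meter_id": ["m1", "m1", "m2"],
            "read_at": ["2024-01-01", "2024-01-01", "2024-01-01"],
            "kwh": [1.0, 1.0, 2.5],
        }
    )
    result = normalise_readings(raw)
    assert len(result) == 2  # duplicate m1 reading removed
    assert list(result.columns) == ["meter_id", "read_at", "kwh"]
```

Writing transforms as small, pure functions like this is what makes pipelines straightforward to unit-test before they are wired into Glue or dbt jobs.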

Responsibilities

  • Design and implement data flow and processing pipelines using AWS Glue, Lambda, and Fivetran (a minimal sketch follows this list)
  • Build and maintain CI/CD pipelines in AWS cloud
  • Integrate machine learning models using services like AWS SageMaker
  • Provide DevOps support for our Enterprise Data Warehouse and BI & Analytics platform
  • Design and implement unit tests for data processing pipelines
  • Contribute to technical leadership and architecture decisions
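
As an illustration of the first responsibility, the following is a minimal, hypothetical Python sketch (not the team's actual implementation) of an AWS Lambda handler that starts an AWS Glue job run via boto3; the job name, environment variable, and argument key are assumptions.

```python
# Hypothetical sketch only: a Lambda handler that triggers a Glue job run.
# GLUE_JOB_NAME and the "--source_event" argument are illustrative choices,
# not a documented convention.
import json
import os

import boto3

glue = boto3.client("glue")


def handler(event, context):
    """Start a Glue job run, forwarding the triggering event as an argument."""
    response = glue.start_job_run(
        JobName=os.environ.get("GLUE_JOB_NAME", "example-ingest-job"),
        Arguments={"--source_event": json.dumps(event)},
    )
    return {"job_run_id": response["JobRunId"]}
```

In practice a handler like this would typically be wired to an S3 event notification or an EventBridge schedule, with the Glue job itself doing the heavy lifting of the transformation.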

Benefits

  • Work in a collaborative and inclusive environment with a strong focus on professional development
  • Enjoy flexibility, including work-from-home options and flexible hours
  • Comprehensive health and wellbeing support
  • Great benefits that reflect the importance of family
