Movable Ink is hiring a
Principal Data Engineer


💵 $230k-$250k
📍 Remote - United States

Summary

Join Movable Ink as a Principal Data Engineer to drive the direction of our Data Warehouse and empower teams to make data-driven decisions.

Requirements

  • 12+ years of professional experience in data engineering, software engineering, database administration, business intelligence, or a related field, including 8+ years as a Data Engineer focused on cloud-based Data Warehouse platforms (Redshift, Snowflake, Firebolt, BigQuery); we currently use Redshift
  • Elite-level understanding of how to work with and optimize multi-petabyte, mission-critical databases, ensuring high availability, performance, and reliability, informed by a strong grasp of database internals
  • Elite-level proficiency in Python and SQL, with significant experience building robust data pipelines in those languages
  • Elite-level proficiency in using, deploying, and managing at least one data pipeline orchestration tool/framework, such as Apache Airflow or Prefect; we currently use Apache Airflow
  • Significant experience in building solutions that comply with regulatory requirements such as GDPR and CCPA
  • Significant experience in designing and implementing solutions that can support both batch and real-time data consumption models
  • Significant experience in building solutions that implement data security best practices within an AWS environment
  • Significant experience in providing technical leadership, setting best practices, and successfully driving the adoption of new technologies and methodologies within a fast-moving organization
  • Significant data modeling experience spanning more than one data modeling paradigm (e.g. Data Vault, Kimball/Ross, Inmon)
  • Experience working in an Agile/Scrum environment, including working with technical managers and product owners/managers to break down high-level requirements into actionable cards
  • Experience working with streaming platforms such as Apache Kafka and Apache Pulsar
  • Excellent communication skills for effective collaboration with cross-functional teams

Responsibilities

  • Partner with internal operations teams to identify, collect, and integrate data from various business systems, ensuring comprehensive and accurate data capture
  • Design, implement, and maintain robust data pipelines that feed data into our Data Platform, ensuring high performance, scalability, and reliability
  • Ensure data pipelines adhere to best practices and are optimized for performance and scalability
  • Conduct thorough testing of data pipelines to validate data accuracy and integrity
  • Monitor data pipelines, troubleshoot issues as they arise, and implement improvements where applicable
  • Establish and track SLAs for data processing and delivery, ensuring timely and reliable access to data for all users
  • Mentor less experienced team members, and establish patterns and practices the team can follow to increase the quality, accuracy, and efficiency of its solutions
  • Work with other teams to ensure data access aligns with company policies and that data access, processing, and storage comply with regulatory requirements (e.g. GDPR, CCPA)

