onXmaps, Inc. is hiring a
Senior Data Engineer

💵 $150k-$175k
📍 Remote - United States

Summary

Join a dynamic team at onX, a pioneer in digital outdoor navigation, and help build the scalable data pipelines and infrastructure that support analytics, machine learning, and data-driven decision-making.

Requirements

  • 8+ years of experience in data engineering, with at least three years working in a cloud environment (preferably GCP)
  • Expertise in GCP services, including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, Cloud Functions, and Cloud SQL
  • Strong proficiency in SQL and experience with writing complex queries for data extraction and analysis
  • Hands-on experience with ETL development and workflow orchestration using tools like Apache Airflow
  • Proficiency in one or more programming languages such as Python or Scala for data processing and pipeline development
  • Experience with streaming data pipelines (e.g., using Pub/Sub and Dataflow) as well as batch processing; an illustrative sketch follows this list
  • Understanding of data warehousing concepts and experience working with large datasets and query optimization
  • Knowledge of best practices in data governance, security, and compliance, particularly in a cloud environment
  • Strong analytical, problem-solving, and troubleshooting skills with attention to detail
  • Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team
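
To make the streaming requirement concrete, here is a minimal, hypothetical sketch of the Pub/Sub-to-BigQuery pattern referenced above, written with the Apache Beam Python SDK (the SDK that Dataflow executes). The project, subscription, and table names are placeholders, not onX resources, and the pipeline is illustrative rather than a description of onX's actual codebase.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder resource names -- purely illustrative, not onX infrastructure.
    SUBSCRIPTION = "projects/example-project/subscriptions/events-sub"
    TABLE = "example-project:analytics.events"

    def run():
        # streaming=True tells the runner (e.g., Dataflow) to keep the pipeline running.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
                | "DecodeJson" >> beam.Map(lambda message: json.loads(message.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    TABLE,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )

    if __name__ == "__main__":
        run()

The same pipeline can be run locally with the DirectRunner for testing or submitted to Dataflow for production, which is one reason Beam is a common choice for the combined batch and streaming work described in this role.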

Responsibilities

  • Design and build scalable data pipelines using GCP services like Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer to ingest, process, and store large datasets from multiple sources
  • Develop, test, and maintain data models, schemas, and ETL (Extract, Transform, Load) processes using tools like BigQuery, Cloud SQL, and Data Studio
  • Collaborate with stakeholders to understand business requirements and translate them into effective data infrastructure solutions
  • Optimize data pipelines for performance, scalability, and cost-efficiency using GCP-native tools such as Dataflow, Dataproc, and Bigtable
  • Ensure data quality, integrity, and security by developing validation processes and enforcing best practices for data governance and access control using Dataplex and GCP security tooling
  • Automate workflows and processes using Cloud Composer (Apache Airflow) to ensure data pipelines run reliably and on schedule (see the sketch after this list)
  • Perform troubleshooting and root cause analysis on data pipelines and infrastructure issues, ensuring high availability and reliability
  • Collaborate with SRE and Infrastructure teams to manage GCP resources efficiently, including compute, storage, and network resources
  • Stay updated with GCP services, best practices, and emerging technologies to improve and optimize the data platform continuously
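
As a hypothetical illustration of the Cloud Composer scheduling mentioned above, the sketch below defines a small Airflow DAG that runs one BigQuery job each morning; the DAG id, schedule, dataset, and query are invented for the example and do not describe onX's actual platform.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    # Retry twice with a short delay so transient failures do not page anyone.
    default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

    with DAG(
        dag_id="daily_events_rollup",          # invented name, for illustration only
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 6 * * *",         # every day at 06:00 UTC
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Rebuild a small daily summary table inside BigQuery.
        rollup_events = BigQueryInsertJobOperator(
            task_id="rollup_events",
            configuration={
                "query": {
                    "query": """
                        CREATE OR REPLACE TABLE analytics.daily_events AS
                        SELECT DATE(event_ts) AS event_date, COUNT(*) AS events
                        FROM analytics.events
                        GROUP BY event_date
                    """,
                    "useLegacySql": False,
                }
            },
        )

Cloud Composer is managed Airflow, so a file like this placed in the environment's DAGs bucket is picked up and scheduled automatically; retries, alerting, and backfills then come from Airflow itself rather than hand-rolled cron jobs.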

Benefits

  • Competitive salaries
  • Annual bonuses
  • Equity
  • Comprehensive health benefits including a no-monthly-cost medical plan
  • Parental leave plan of 5 or 13 weeks fully paid
  • 401k matching at 100% for the first 3% you save and 50% from 3-5%
  • Company-wide outdoor adventures and amazing outdoor industry perks
  • Annual “Get Out, Get Active” funds to fuel your active lifestyle in and outside of the gym
  • Flexible time away package that includes PTO, STO, VTO, quiet weeks, and floating holidays
