Data Analytics Engineer

Capella Space

πŸ’΅ $86k-$115k
πŸ“Remote - Worldwide

Summary

Join Capella Space's DataOps team as a Data Analytics Engineer to build the next-generation data analytics infrastructure. Design, build, and optimize scalable and reliable data pipelines ingesting data from various sources. Provide a clean, reliable, and timely data foundation for engineers and stakeholders. Collaborate with teammates on data modeling and transformation using dbt. Ensure high availability, reliability, and timeliness of data pipelines through orchestration, scheduling, and monitoring. Implement data quality checks and testing frameworks. Act as the primary data provider for the DataOps team, creating comprehensive documentation for data pipelines and processes.

Requirements

  • 3+ years of hands-on experience in a Data Engineering role
  • A desire to build scalable, robust, and interesting platforms
  • An enthusiasm for learning new technologies and approaches to problem-solving
  • Strong analytical and problem-solving skills, with a focus on building efficient systems
  • Detail-oriented with a passion for data quality, accuracy, and reliability
  • Excellent communication and cross-team collaboration skills
  • Strong proficiency in Python for data engineering tasks (e.g., scripting, data manipulation, API integration)
  • Strong proficiency in SQL for complex querying and data modeling
  • Proven experience building and maintaining data pipelines, especially from sources like third-party APIs and databases using Change Data Capture (CDC)
  • Hands-on experience with a cloud data warehouse (Snowflake strongly preferred)
  • Experience with data transformation tools, preferably dbt
  • Experience with CI/CD tools (e.g., GitLab CI)

Responsibilities

  • Data Pipeline Development & Ingestion: Design, build, and maintain efficient and resilient data pipelines to extract and load data into the data warehouse from third-party APIs, databases (leveraging CDC), and various internal systems
  • Data Modeling & Transformation: Collaborate with teammates to implement and support dbt data models, ensuring raw data is effectively transformed for analytical use
  • Pipeline Orchestration & Monitoring: Assist in the orchestration, scheduling, and monitoring of data pipelines to ensure high availability, reliability, and timeliness of data
  • Data Quality & Testing: Implement data quality checks, validation rules, and test frameworks within the pipelines to ensure the accuracy and integrity of the data
  • Collaboration & Support: Act as the primary data provider for the DataOps team, ensuring clean, structured data needed for advanced modeling, analytics, and data application development
  • Documentation: Create and maintain comprehensive documentation for data pipelines, sources, and processes to support transparency and knowledge sharing

Preferred Qualifications

  • Familiarity with creating data apps or dashboards
  • Broad experience with a major cloud platform (AWS, Azure, or Google Cloud)
  • Experience with Dimensional Data Warehouse models

Benefits

  • We provide extensive medical coverage, including strong vision and dental plans, flexible spending accounts, and additional supplemental health options
  • 401(k) plan to invest in your long-term retirement goals
  • Generous Parental Leave
  • Paid Flexible Time Off Policy
  • Lifestyle Spending Account
  • Commuter & Parking Benefits
  • Mental Health Resources
  • Monthly Phone Stipend
  • Furry friends? We’ve got you covered with a dog-friendly work environment, plus pet insurance options for them
