Senior Data Engineer I

Splice

πŸ’΅ $142k-$155k
πŸ“Remote - United States

Summary

Join Splice as a Senior Data Engineer I and contribute to building and maintaining our data warehouse and pipelines. You will create self-service tools, address scalability issues, and ensure data quality. This remote role requires strong Python, SQL, and Unix skills, along with experience in data warehousing and transformation frameworks. You will champion data literacy and work in a collaborative, remote environment. The role includes an on-call rotation to ensure system uptime. Apply today to join our team!

Requirements

  • 5+ years of experience building scalable and durable software
  • Demonstrated mastery of Python, SQL, and Unix fundamentals
  • Demonstrated operational excellence in maintaining data warehouses, such as GCP BigQuery or AWS Redshift
  • Strong familiarity with data transformation frameworks, such as sqlmesh or dbt
  • Experience with business intelligence platforms or data visualization frameworks like Looker, Hashboard, or Observable
  • Strong debugging skills, especially with distributed systems
  • Experience building and supporting cloud infrastructure on Google Cloud Platform (GCP) and Amazon Web Services (AWS)
  • Clear and consistent communication in a distributed environment

Responsibilities

  • Own and operate the structure of our Data Warehouse, ensuring reliable ingestion of mission-critical data and reliable builds of our pipelines
  • Build and maintain self-service tools and extensible datasets that enable our peers across the organization to get the insight they need
  • Identify, scope, and execute projects that address scalability issues in our batch builds, automate manual workflows, and add confidence to our analytics by simplifying our datasets
  • Ensure the quality of our data by writing tests, building observability into our pipelines, reviewing RFCs, and providing guidance in data modeling
  • Participate in a business hours-only on-call rotation to ensure the uptime and quality of our systems
  • Create and cultivate a culture of data literacy, experimentation, and data-driven decision making

Preferred Qualifications

  • Experience building Infrastructure as Code (IaC) with Terraform
  • Demonstrated proficiency with observability tools such as StatsD, Datadog, or CloudWatch
  • Demonstrated proficiency with containers and container orchestration

Benefits

Remote work
