Remote Senior Data Engineer

Fleetio

πŸ“Remote - United States, Canada

Job highlights

Summary

Join our Data and Analytics team as a Senior Data Engineer to help build the next generation of our data platform and product. You will design and implement our data platform, build data products for marketplaces, integrations, and analytics, and work closely with our product and engineering teams.

Requirements

  • 5+ years experience working in a data engineering or data-focused software engineering role
  • Experience transforming raw data into clean models using standard tools of the modern data stack, with a deep understanding of ELT and data modeling concepts
  • Experience with streaming data pipelines such as Kafka or Kinesis
  • Proficiency in Python and a proven track record of delivering production-ready Python applications
  • Experience in designing, building, and administering modern data pipelines and data warehouses
  • Experience with dbt
  • Experience with semantic layers such as Cube or MetricFlow
  • Experience with Snowflake, BigQuery, or Redshift
  • Experience with version control tools such as GitHub or GitLab
  • Experience with ELT tools such as Stitch or Fivetran
  • Experience with orchestration tools such as Prefect or Dagster
  • Experience with CI/CD and IaC tooling such as GitHub Actions and Terraform
  • Experience with business intelligence solutions (Metabase, Looker, Tableau, Periscope, Mode)
  • Experience with serverless cloud functions (AWS Lambda, Google Cloud Functions, etc.)
  • Excellent communication and project management skills with a customer service focused mindset
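As a hypothetical illustration of the raw-to-clean transformation work the requirements describe, the sketch below normalizes raw records into a tidy model in plain Python. All field names are invented for illustration; in a real stack this logic would typically live in a dbt staging model rather than application code.

```python
from datetime import datetime

def clean_vehicle_events(raw_rows):
    """Normalize raw event rows into a tidy model.

    Hypothetical example: field names are invented for illustration.
    """
    cleaned = []
    for row in raw_rows:
        # Drop rows missing the primary key, as a staging model would.
        if not row.get("vehicle_id"):
            continue
        cleaned.append({
            "vehicle_id": int(row["vehicle_id"]),
            # Normalize free-text status values to trimmed lowercase.
            "status": (row.get("status") or "unknown").strip().lower(),
            # Parse ISO-8601 timestamps into datetime objects.
            "recorded_at": datetime.fromisoformat(row["recorded_at"]),
        })
    return cleaned

raw = [
    {"vehicle_id": "1", "status": " Active ", "recorded_at": "2024-01-05T10:00:00"},
    {"vehicle_id": None, "status": "retired", "recorded_at": "2024-01-06T11:00:00"},
]
model = clean_vehicle_events(raw)
```

The equivalent dbt model would express the same casting, trimming, and null-filtering in SQL, with tests asserting the key constraints.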

Responsibilities

  • Enable and scale self-serve analytics for all Fleetio team members
  • Develop data destinations and custom integrations, and maintain open-source packages that allow our customers to easily integrate Fleetio data with the modern data stack
  • Maintain and develop custom data pipelines from operational source systems to our data platform for both streaming and batch sources
  • Develop our internal data infrastructure stack, and improve the hygiene and integrity of our data platform and ancillary tools by maintaining and monitoring the ELT pipeline
  • Architect, design, and implement core components of our data platform beyond the traditional warehouse, including data observability, experimentation, data science, and other data products
  • Develop and maintain streaming data pipelines from a variety of databases and data sources
  • Collaborate with other Fleetians around the company to understand data needs and ensure required data is collected, modeled, and available to team members
  • Document best practices and coach data analysts, product managers, engineers, and others on data modeling and on SQL query optimization and reusability. Keep our data platform tidy by managing roles and permissions and deprecating old projects
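The streaming-pipeline and serverless responsibilities above can be sketched together as a minimal AWS Lambda-style handler for Kinesis-shaped records. This is a hypothetical sketch, not Fleetio's actual pipeline: the record shape follows the standard Kinesis event structure, and the "sink" is simply a returned list.

```python
import base64
import json

def handler(event, _context=None):
    """Minimal Lambda-style handler for Kinesis-shaped records.

    Hypothetical sketch: decodes each base64 payload and collects
    valid JSON events into a downstream sink (here, a returned list).
    """
    sink = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        try:
            sink.append(json.loads(payload))
        except json.JSONDecodeError:
            # In production, malformed payloads would go to a
            # dead-letter queue rather than being silently dropped.
            continue
    return {"processed": len(sink), "events": sink}

event = {"Records": [
    {"kinesis": {"data": base64.b64encode(
        json.dumps({"id": 1}).encode()).decode()}},
    {"kinesis": {"data": base64.b64encode(b"not json").decode()}},
]}
result = handler(event)
```

In a fuller pipeline, the decoded events would be written to the warehouse or handed to an orchestrator such as Prefect or Dagster for downstream modeling.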

Benefits

  • Multiple health/dental coverage options
  • Vision insurance
  • Incentive stock options
  • 401(k) match of 4%
  • PTO - 4 weeks
  • 12 company holidays + 2 floating holidays
  • Parental leave - 12 weeks paid for the birthing parent, 4 weeks for the non-birthing parent
  • FSA & HSA options
  • Short and long term disability (short term 100% paid)
  • Community service funds
  • Professional development funds
  • Wellbeing fund - $150 quarterly
  • Business expense stipend - $125 quarterly
  • Mac laptop + new hire equipment stipend
  • Monthly catered lunches
  • Fully stocked kitchen with tons of drinks & snacks
  • Remote working friendly since 2012 #LI-REMOTE

Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Fleetio know you found this job on JobsCollider. Thanks! 🙏