Senior Product Analyst

Dremio

πŸ“Remote - Portugal

Summary

Join Dremio as an Analytics Engineer and build data pipelines for our internal data lake. You will work across the analytics stack: extracting and transforming data, managing data quality, and creating data products and dashboards. Engage with stakeholders to understand their data needs and collaborate to deploy tracking and analytics tools. Maintain data pipelines, build enriched telemetry data for anomaly-detection use cases, and configure integrations with various tools. Create and QA dashboards and reports to ensure accuracy. This role requires strong technical skills, experience with data pipelines and analytics tools, and a proven ability to manage projects.

Requirements

  • Bachelor's or Master's degree in Data Science, Computer Science, Math, or equivalent
  • A minimum of 5 years of experience as an Analytics Engineer, Data Engineer, or in a similar role
  • Expert in designing and implementing data pipelines using Python
  • Highly skilled in SQL, with a clear understanding of window functions, CTEs, subqueries, etc.
  • Basic familiarity with Iceberg, or knowledge of a similar data lake table format with a strong desire to learn Iceberg
  • Experience creating reports and dashboards using Tableau or an equivalent tool
  • Experience working with AWS, GCP, or Azure cloud-based storage
  • Experience working with tools like Fivetran, Stitch, or equivalent
  • Experience with product analytics solutions like Intercom, Heap, and Google Analytics
  • Strong familiarity with dbt
  • Experience with ticketing systems (JIRA), communication tools (Slack), repository systems (GitHub)
  • Excellent project management skills with a proven track record of cross-functional impact
  • Interested and motivated to be part of a fast-moving startup with a fun and accomplished team

Responsibilities

  • Engage with stakeholders to understand, document, and prioritize data-driven decision-making needs
  • Collaborate cross-functionally to deploy the tracking and analytics tools needed to support product usage telemetry
  • Create and maintain data pipelines for the internal Data Lake, adhering to semantic layer best practices
  • Build and maintain data pipelines to provide enriched telemetry data for anomaly detection use-cases
  • Configure, deploy and maintain integrations with popular tools such as GitHub and Slack
  • Develop and manage data pipelines to support internal analytics, data needs, and business requests
  • Establish, document, and communicate dashboarding and reporting dimensions
  • Seek opportunities for innovation to further empower our decision makers, mature our Lakehouse and drive the business
  • Create dashboards on product telemetry, user activity, pipeline health, documentation usage, etc.
  • QA dashboards and reports against the raw data, and partner with Data Lake owners to ensure accuracy

Benefits

Workplace Wednesdays: designed to break down silos, build relationships, and improve cross-team communication. Lunch catering / meal credits are provided in the office, and local socials align with Workplace Wednesdays. In general, Dremio will remain a hybrid work environment; we will not implement a 100% (5 days a week) return-to-office policy for all roles.
