Senior Analytics Engineer
Dremio
Remote - United Kingdom
Please let Dremio know you found this job on JobsCollider. Thanks!
Job highlights
Summary
Join Dremio, a unified lakehouse platform company, as a Senior Analytics Engineer responsible for data pipelines, data quality, and reporting within our Data Lake. You will collaborate with teams across the company to understand their data needs, build and maintain pipelines, and create dashboards. The role requires expertise in data pipelines, SQL, and cloud technologies. You will work with stakeholders to prioritize data-driven decision-making needs and contribute to a fast-paced, innovative environment. The ideal candidate has strong analytical and communication skills and experience with the relevant tools and technologies.
Requirements
- Bachelor's or Master's degree in Data Science, Computer Science, Math, or equivalent
- At least 5 years of experience as an Analytics Engineer, Data Engineer, or in a similar role
- Expert in designing and implementing data pipelines using Python
- Expert in SQL, with a clear understanding of window functions, CTEs, subqueries, etc.
- Basic familiarity with Apache Iceberg, or knowledge of a similar data lake table format and a strong desire to learn Iceberg
- Experience creating reports and dashboards using Tableau or equivalent tool
- Experience working with cloud-based storage on AWS, GCP, or Azure
- Experience working with tools like Fivetran, Stitch, or equivalent
- Experience with product analytics data solutions like Intercom, Heap and Google Analytics
- Strong familiarity with dbt
- Experience with ticketing systems (JIRA), communication tools (Slack), repository systems (GitHub)
- Excellent project management skills with a proven track record of cross-functional impact
Responsibilities
- Engage with stakeholders to understand, document, and prioritize data-driven decision-making needs
- Collaborate cross-functionally to deploy the tracking and analytics tools needed to support product usage telemetry
- Create and maintain data pipelines for the internal Data Lake, adhering to semantic-layer best practices
- Build and maintain data pipelines to provide enriched telemetry data for anomaly detection use-cases
- Configure, deploy and maintain integrations with popular tools such as GitHub and Slack
- Support pipeline development in response to internal data and analytics requests from the business
- Establish, document, and communicate dashboarding and reporting dimensions
- Seek opportunities for innovation to further empower our decision makers, mature our Lakehouse and drive the business
- Create dashboards on product telemetry, user activity, pipeline health, documentation usage, etc.
- QA Dashboards and Reports against the raw data and partner with Data Lake owners to ensure accuracy
Preferred Qualifications
Interested and motivated to be part of a fast-moving startup with a fun and accomplished team