Analytics Engineer

Multiverse
Remote - United Kingdom
Summary
Join Multiverse and help us set a new course for work. We are looking for an Analytics Engineer to build and maintain the data models that power our analytics and data science. You will develop robust, scalable dbt pipelines and contribute to the evolution of our data platform, ensuring data is accessible, trusted, and well structured. This is a hands-on role that calls for a strong technical foundation, sharp problem-solving skills, proficiency in SQL, and close collaboration with teams across the business. Day to day, the work spans data modelling and transformation, testing, documentation, CI/CD, performance optimisation, and development on Snowflake within a data lake architecture. The ideal candidate is detail-oriented, solution-driven, and takes ownership of their work.
Requirements
- 2+ years of experience building and optimising complex SQL, including joins, window functions, and query optimisation techniques (see the sketch after this list)
- Strong understanding of data modelling and warehouse design (e.g., Kimball-style dimensional modelling)
- Experience using dbt in production environments, including testing and documentation
- Familiar with version control (GitHub)
- Experience tuning dbt models and SQL queries for performance
- Able to independently transform business logic into technical implementation
- Comfortable participating in and contributing to code reviews
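
To give a flavour of the level of SQL involved, here is a minimal window-function sketch; the table and column names (raw.enrolments, apprentice_id, enrolled_at) are hypothetical and stand in for whatever the real warehouse defines.

```sql
-- Hypothetical example: keep only the latest enrolment per apprentice
with ranked as (
    select
        apprentice_id,
        programme_id,
        enrolled_at,
        row_number() over (
            partition by apprentice_id
            order by enrolled_at desc
        ) as row_num
    from raw.enrolments          -- illustrative source table
)
select
    apprentice_id,
    programme_id,
    enrolled_at
from ranked
where row_num = 1;
```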
Responsibilities
- Build and maintain dbt models to transform raw data into clean, documented, and accessible data sets (a brief model sketch follows this list)
- Translate business and analytics requirements into scalable data models
- Design and implement data warehouse schemas using dimensional modelling techniques (fact and dimension tables, slowly changing dimensions, etc.)
- Participate in design and code reviews to improve model design and query performance
- Implement and maintain dbt tests to ensure data quality and model accuracy
- Document data models clearly to support cross-functional use
- Use GitHub and CI/CD pipelines to manage code and deploy changes safely and efficiently
- Optimise dbt models and SQL queries for performance and maintainability
- Work with Snowflake, developing on top of a data lake architecture
- Ensure dbt models are well-integrated with data catalogs and accessible for downstream use
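
As a rough illustration of the kind of dbt work described above, here is a minimal incremental model sketch; the model, source, and column names (fct_enrolments, stg_enrolments, dim_programmes) are hypothetical, not Multiverse's actual project.

```sql
-- models/marts/fct_enrolments.sql  (hypothetical model; names are illustrative only)
{{ config(
    materialized='incremental',
    unique_key='enrolment_id'
) }}

select
    e.enrolment_id,
    e.apprentice_id,
    p.programme_key,
    e.enrolled_at,
    e.status
from {{ ref('stg_enrolments') }} as e
left join {{ ref('dim_programmes') }} as p
    on e.programme_id = p.programme_id

{% if is_incremental() %}
  -- only process rows newer than what is already in the target table
  where e.enrolled_at > (select max(enrolled_at) from {{ this }})
{% endif %}
```

In practice a model like this would be paired with schema tests (for example unique and not_null) and documentation in a schema.yml file, in line with the testing and documentation responsibilities above.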
Preferred Qualifications
- Experience with Snowflake
- Experience with CI/CD for data workflows
- Familiarity with Python/Airflow for data transformation or orchestration tasks
- Experience with data visualisation tools (e.g., Tableau, Looker)
- Working knowledge of infrastructure-as-code tools like Terraform