Summary
Join Docplanner as an Analytics Engineer and contribute to the development and maintenance of their data platform. You will build and improve the data warehouse, collaborate with analytics teams, and ensure data integrity. The role involves data modeling, DWH engineering, web tracking, and data governance initiatives. You will work with various technologies, including dbt, Data Vault, SQL, Python, and cloud technologies. The company offers a flexible work environment, remote options, and various benefits.
Requirements
- Extensive experience with data warehousing and/or database administration
- Expertise in SQL, particularly query optimization and database design
- Strong skills in Python and SQL, following good coding practices
- Strong data modeling skills (you can name at least five differences between Kimball and Inmon :))
- Deep knowledge of BI, data modeling techniques, and trends (e.g., Data Vault, Kimball, Data Mesh)
- Strong project management skills and experience managing projects end-to-end, ensuring timely delivery and alignment with business goals
- Excellent communication and interpersonal skills
- Growth mindset: nobody ticks all those boxes above, but a willingness to learn is strongly valued here
- Hands-on experience with orchestrators like Airflow* or Dagster
- Ability to work collaboratively and adopt code versioning best practices (git)
- Hands-on experience with cloud technologies and ETL/ELT tools (at least one of AWS, GCP, Azure, dbt, Dagster, Airflow, Luigi)
- Good level of English
Responsibilities
- Build, automate, and ensure the quality of data products in cooperation with analysts and data engineers
- Own and improve the DWH data model, making it more accessible and easier to use
- Own and maintain the DWH cluster
- Apply expertise in dbt, data domainisation, and Data Vault
- Collaborate with analytics teams on building data models for different business areas
- Provide technical guidelines and maintain a big-picture view of the data platform
- Support teams to own their data models and gain autonomy
- Advocate for best practices in data engineering and analytics, e.g., by leading workshops and training sessions
- Align data initiatives with business goals
- Proactively communicate with internal stakeholders
Preferred Qualifications
- *Airflow specifically is nice to have; the must-have is hands-on experience with any orchestrator
- Hands-on experience with dbt in large-scale deployments
- Knowledge of Spark and the Scala language is nice to have
- Familiarity with our stack (e.g., the AWS ecosystem, Redshift, S3, Tableau, Superset)
- Infrastructure knowledge: Docker, Terraform, Kubernetes, Helm
Benefits
- A salary adequate to your experience and skills
- Flexible remuneration and benefits system via Flexoh, which includes: restaurant card, transportation card, kindergarten, and training tax savings
- Share options plan after 6 months of working with us
- Remote or hybrid work model with our hub in Barcelona
- Flexible working hours (fully flexible; in most cases you only need to attend a couple of meetings each week)
- Summer intensive schedule during July and August (work 7 hours, finish earlier)
- 23 paid holidays, with exchangeable local bank holidays
- Additional paid holiday on your birthday or work anniversary (you choose what you want to celebrate)
- Private healthcare plan with Adeslas for you and subsidized for your family (medical and dental)
- Access to hundreds of gyms for a symbolic fee, for you and your family, in partnership with Andjoy
- Access to iFeel, a technological platform for mental wellness offering online psychological support and counseling
- Free English and Spanish classes