Senior Data Engineer

Kittl

πŸ“Remote - Germany

Summary

Join Kittl, a rapidly growing company redefining graphic design, as a Senior Data Engineer. You will architect and drive the data platform, owning the end-to-end infrastructure for real-time event pipelines and analytics tooling. The role involves leading initiatives across GCP, PubSub, Dagster, and dbt, ensuring reliable data flow for product analytics, experimentation, and decision-making. You will collaborate with teams across the company, translating business goals into scalable data architecture. Kittl offers a hybrid working culture with flexible hours and remote work options. The company is well-funded and has a diverse team.

Requirements

  • Experience: Solid background in deploying and managing cloud-native data infrastructure, ideally in GCP
  • Infrastructure: Production-level experience with containerized deployments and service orchestration
  • Data lifecycle: Comfortable owning the full data journey from event design to modeling, transformation, and activation
  • Analytics: Deep understanding of event-driven architecture and the trade-offs between various analytics data sources
  • Collaboration: Proven track record of translating business needs into scalable solutions in cross-functional environments

Responsibilities

  • Own and evolve data infrastructure: Continuously improve our data systems across GCP, including Compute Engine, PubSub, and BigQuery, to support a scalable analytics foundation
  • Develop event-driven pipelines: Build reliable, event-based data pipelines across frontend and backend systems to enable experimentation, personalization, and real-time analysis (see the sketch after this list)
  • Champion engineering excellence: Lead CI/CD processes, manage secrets securely, and ensure high-quality deployments using Docker, while promoting clean code and peer review practices in GitHub
  • Support analytics and data science: Design and implement advanced data transformations in Python and SQL, and scale Dagster orchestration and dbt models to meet evolving analytical needs
  • Ensure reliable data tracking: Act as a key link between engineering and analytics, maintaining traceable event tracking and A/B testing infrastructure (GrowthBook) to support growth initiatives
  • Monitor and optimize systems: Maintain ingestion tools like Airbyte, monitor and troubleshoot pipelines, and drive architectural improvements across backend and frontend data capture while optimizing for cost and performance
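To give a concrete sense of the pipeline work above, here is a minimal sketch of a PubSub-to-BigQuery event consumer. The project, subscription, and table names are hypothetical, and a production version would add batching, schema enforcement, and dead-lettering:

```python
import json

from google.cloud import bigquery, pubsub_v1

# Hypothetical resource names for illustration only; the real project,
# subscription, and table IDs would come from the actual GCP setup.
PROJECT = "kittl-prod"
SUBSCRIPTION = "frontend-events-sub"
TABLE = "kittl-prod.analytics.raw_events"

bq = bigquery.Client(project=PROJECT)
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)

def handle(message) -> None:
    """Stream one event from PubSub into BigQuery, acking only on success."""
    row = json.loads(message.data)
    errors = bq.insert_rows_json(TABLE, [row])  # streaming insert
    if not errors:
        message.ack()   # row persisted; safe to acknowledge
    else:
        message.nack()  # redeliver so the event is not lost

streaming_pull = subscriber.subscribe(sub_path, callback=handle)
streaming_pull.result()  # block and consume until cancelled
```

Acknowledging a message only after the write succeeds is the simplest way to keep event delivery traceable end to end.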

Preferred Qualifications

  • Experience supporting ML workflows or deploying models in production is a strong plus

Benefits

  • Flexible working hours: Our core hours are 11am–5pm CET, leaving the rest of your schedule flexible to fit your style
  • Remote work: Work up to 50 days (10 weeks) fully remote per year from anywhere in the world, as long as you maintain our core hours
  • Learning & development: Our L&D budget supports your professional growth
  • Mobility benefit: We fully cover your monthly BVG public transport ticket
  • Health and fitness: Urban Sports Club membership discount
  • Vacation: Up to 30 vacation days per year

