DataOps Team Lead I

Adaptiq

πŸ“Remote - Poland

Summary

Join Adaptiq's Bigabid team as a DataOps Team Lead, leading a growing team and owning the stability, observability, and quality of data pipelines. You will build, maintain, and monitor robust data pipelines using Python and Airflow, acting as the go-to expert for resolving data issues. This role requires collaboration with multiple teams, including Data Engineering, Product, BI, Operations, Backend, and Data Science. You will also own strategic projects like metadata store development and anomaly detection systems. The ideal candidate will balance project leadership with hands-on technical work, fostering a culture of quality and communication. This is a 50/50 hands-on and leadership role, perfect for someone who thrives at the intersection of data engineering, operational excellence, and cross-functional collaboration.

Requirements

  • 3+ years of experience in data engineering/data operations, with at least 1 year of team or project leadership
  • Proficient in Python for scripting and automation (clean, logical code – not full-stack development)
  • Strong experience with Airflow (hands-on, not through abstraction layers)
  • Solid understanding of SQL and NoSQL querying, schema design, and cost-efficient querying (e.g., Presto, document DBs)
  • Comfortable managing incident escalation, prioritizing urgent fixes, and guiding teams toward solutions
  • Analytical, communicative, and excited to work with smart, mission-driven people

Responsibilities

  • Lead a DataOps team of 2 (and growing), with ownership over Bigabid’s core data quality and observability processes
  • Build, maintain, and monitor robust data pipelines and workflows (Python + Airflow)
  • Act as the go-to person for identifying and resolving data issues affecting production systems
  • Coordinate with multiple teams: Data Engineering, Product, BI, Operations, Backend, and occasionally Data Science
  • Own projects such as metadata store development, anomaly detection systems, and scalable data quality frameworks
  • Balance strategic project leadership with hands-on scripting, debugging, and optimizations
  • Promote a culture of quality, reliability, and clear communication in a fast-moving, high-volume environment

Preferred Qualifications

  • Previous experience as a NOC or DevOps engineer
  • Familiarity with PySpark

Benefits

  • We provide 20 days of vacation leave per calendar year (plus official national holidays of the country you are based in)
  • We provide full accounting and legal support in all countries where we operate
  • We operate a fully remote work model and provide a powerful workstation, plus access to a co-working space if you need it
  • We offer a highly competitive package with yearly performance and compensation reviews
