Senior Data Engineer

Kargo
Summary
Join Kargo, a leading creator of cross-screen ad experiences, as a Senior Data Engineer. You will play a key role in evolving our data infrastructure, with a focus on hands-on implementation, problem-solving, and exploration of new technical approaches. You will collaborate with technical leads and peers to enhance and scale the data processes that power our targeting systems. This remote, permanent position requires expertise in large-scale data systems, Python, Spark, and Iceberg. You will implement and maintain ETL/ELT pipelines, contribute to testing strategies, and keep our systems operationally reliable and cost-effective. Familiarity with identity, privacy, and targeting methodologies in AdTech is essential.
Requirements
- Strong expertise in implementing, maintaining, and optimizing large-scale data systems with minimal oversight
- Deep proficiency in Python, Spark, and Iceberg, with a clear understanding of how to structure data for efficiency and performance (a minimal Spark + Iceberg sketch follows this list)
- Extensive DevOps experience, particularly with AWS (including EKS), Docker, Kubernetes, CI/CD automation using ArgoCD, and monitoring via Prometheus
- Familiarity with Snowflake, including writing and optimizing SQL queries and understanding Snowflake's performance and cost dynamics (an illustrative query sketch also follows this list)
- Comfort with Agile methodologies, including regular use of Jira and Confluence for task management and documentation
- Proven ability to independently drive implementation and problem-solving, turning ambiguity into clearly defined actions
- Excellent communication skills to engage effectively with technical teams and stakeholders
- Familiarity with identity, privacy, and targeting methodologies in AdTech
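
To give a concrete sense of the Spark and Iceberg depth described above, here is a minimal sketch of writing event data into a partitioned Iceberg table with PySpark. The catalog configuration, S3 paths, schema, and table names are hypothetical placeholders, not Kargo's actual setup:

```python
# Minimal, hypothetical sketch: landing raw events in a partitioned Iceberg
# table. Requires the iceberg-spark-runtime package on the Spark classpath;
# catalog name, bucket, and table names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder.appName("impressions-ingest")
    # Assumes an Iceberg catalog named "demo" configured for this cluster.
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

events = spark.read.parquet("s3://example-bucket/raw/impressions/")

# Partitioning on event date lets downstream targeting queries prune files
# instead of scanning the full table.
(
    events.withColumn("event_date", F.to_date("event_ts"))
    .writeTo("demo.ads.impressions")
    .partitionedBy(F.col("event_date"))
    .createOrReplace()
)
```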
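In the same spirit, here is a small illustration of cost-aware Snowflake querying from Python; the connection parameters, warehouse, and table names are placeholders only:

```python
# Hypothetical sketch of cost-aware Snowflake querying with the Python
# connector. All credentials and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",  # in practice, pull from a secrets manager
    warehouse="ANALYTICS_WH",
    database="ADS",
    schema="PUBLIC",
)

# Filtering on the clustering column and selecting only the needed columns
# keeps scanned bytes, and therefore credit spend, down.
query = """
    SELECT campaign_id, COUNT(*) AS impressions
    FROM impressions
    WHERE event_date = CURRENT_DATE - 1
    GROUP BY campaign_id
"""

with conn.cursor() as cur:
    cur.execute(query)
    for campaign_id, impressions in cur.fetchall():
        print(campaign_id, impressions)
conn.close()
```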
Responsibilities
- Independently implement, optimize, and maintain robust ETL/ELT pipelines using Python, Airflow, Spark, Iceberg, Snowflake, Aerospike, Docker, Kubernetes (EKS), AWS, and real-time streaming technologies such as Kafka and Flink (a minimal orchestration sketch follows this list)
- Engage proactively in collaborative design and brainstorming sessions, contributing technical insights and innovative ideas for solving complex data engineering challenges
- Support the definition and implementation of robust testing strategies (see the unit-test sketch after this list), and guide the team in adopting disciplined CI/CD practices using ArgoCD to enable efficient and reliable deployments
- Monitor and optimize data systems and infrastructure to ensure operational reliability, performance efficiency, and cost-effectiveness
- Actively contribute to onboarding new datasets, enhancing targeting capabilities, and exploring modern privacy-compliant methodologies
- Maintain thorough documentation of technical implementations, operational procedures, and best practices for effective knowledge sharing and onboarding
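
As an illustration of the orchestration work in the first responsibility above, here is a minimal Airflow DAG sketch using the TaskFlow API (Airflow 2.4+ for the `schedule` argument); the DAG id, schedule, and task bodies are illustrative assumptions, not the production pipeline:

```python
# Hypothetical daily ETL orchestration sketch with Airflow's TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_targeting_pipeline():
    @task
    def extract() -> str:
        # In production this step might land a day of raw events from Kafka to S3.
        return "s3://example-bucket/raw/impressions/"

    @task
    def transform(raw_path: str) -> str:
        # A Spark job submitted against EKS would typically run here.
        print(f"transforming {raw_path}")
        return "demo.ads.impressions"

    @task
    def load(table: str) -> None:
        # Final step, e.g. refreshing a Snowflake table or an Aerospike cache.
        print(f"loaded {table}")

    load(transform(extract()))


daily_targeting_pipeline()
```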
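And a small sketch of the testing discipline mentioned above: keeping pipeline logic in pure functions makes it unit-testable with pytest. The function and its schema are hypothetical examples, not an actual Kargo pipeline step:

```python
# Hypothetical example: a pure transformation function plus a pytest test.
def dedupe_impressions(rows: list[dict]) -> list[dict]:
    """Keep the first occurrence of each (user_id, campaign_id) pair."""
    seen: set[tuple[str, str]] = set()
    out = []
    for row in rows:
        key = (row["user_id"], row["campaign_id"])
        if key not in seen:
            seen.add(key)
            out.append(row)
    return out


def test_dedupe_keeps_first_occurrence():
    rows = [
        {"user_id": "u1", "campaign_id": "c1", "ts": 1},
        {"user_id": "u1", "campaign_id": "c1", "ts": 2},
        {"user_id": "u2", "campaign_id": "c1", "ts": 3},
    ]
    deduped = dedupe_impressions(rows)
    assert len(deduped) == 2
    assert deduped[0]["ts"] == 1
```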
Preferred Qualifications
- Experience with Airflow for building robust data workflows is strongly preferred