Summary
Join Snowplow, a leader in customer data infrastructure for AI, as a Senior Software Engineer specializing in Go development on our Data Processing team. You will design, build, and test real-time data services on GCP/AWS/Azure, ensuring high throughput and reliability. You will also collaborate with cross-functional teams, review code, manage CI/CD pipelines, and monitor system health. The ideal candidate is growth-oriented, passionate about technology, and committed to building exceptional data pipelines. This role offers autonomy, responsibility, and the opportunity to solve complex problems. Snowplow values a diverse and inclusive team.
Requirements
- Solid experience in software development, particularly in Go (Golang)
- Experience building scalable applications, including database optimization and integration design
- Experience profiling, monitoring and improving application performance
- Experience with continuous integration and continuous deployment (CI/CD) practices
- Proficiency with infrastructure-as-code (IaC) tooling such as Terraform, and with GitHub Actions
- Familiarity with containerization tools such as Docker
- Experience with cloud-based services and environments (e.g., AWS, GCP, Azure)
- Excellent problem-solving skills and attention to detail
- You approach software delivery pragmatically, balancing rapid learning with a commitment to reliable, trusted service for our customers
Responsibilities
- Design, build and test real-time data services (e.g., identity graphs, attribution) on GCP/AWS/Azure, delivering reliable, high-quality code
- Build robust QA, unit and integration tests, both within our Go projects and using our Go-based automated QA framework
- Collaborate in Scrum ceremonies and engage with cross-functional teams for requirements
- Review code to maintain quality and provide constructive feedback
- Manage CI/CD pipelines for automated deployments and reliability
- Monitor system health with observability tools and address issues proactively
- Engage with stakeholders for alignment on project goals and updates
- Research new technologies to improve the Snowplow ecosystem
Preferred Qualifications
- Familiarity with identity resolution, graph algorithms and databases
- Experience working with soft real-time, data-driven systems
- An understanding of event-driven architectures and data processing pipelines
- Experience with Kubernetes, particularly in the context of data processing workflows
- Knowledge of Snowplow products and services
- Experience with data analytics platforms and tools
- Expertise with observability tools like Grafana and Sentry
Benefits
- A competitive package, including share options
- Flexible working
- A generous holiday allowance no matter where you are in the world
- MacBook and home office equipment allowance
- Enhanced maternity, paternity, shared parental and adoption leave