Software Engineer, Data Infrastructure - Kafka
Canonical
Remote - United States
Job highlights
Summary
Join Canonical's data platform team to develop managed solutions for a range of data stores and technologies, including Kafka, and to help build a comprehensive automation suite. The role involves creating and automating infrastructure features of data platforms, collaborating with a distributed team, and writing high-quality Python code, all while working from home.
Requirements
- Proven hands-on experience in software development using Python
- Proven hands-on experience in distributed systems, such as Kafka and Spark
- Have a Bachelor's or equivalent in Computer Science, STEM, or a similar degree
- Willingness to travel up to 4 times a year for internal events
Responsibilities
- Collaborate proactively with a distributed team
- Write high-quality, idiomatic Python code to create new features
- Debug issues and interact with upstream communities publicly
- Work with helpful and talented engineers including experts in many fields
- Discuss ideas and collaborate on finding good solutions
Benefits
- Fully remote working environment - we've been working remotely since 2004!
- Personal learning and development budget of USD 2,000 per annum
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Parental Leave
- Employee Assistance Programme
Please let Canonical know you found this job on JobsCollider. Thanks!