Software Engineer, Data Infrastructure - Kafka
Canonical
Remote - United States
Please let Canonical know you found this job on JobsCollider. Thanks!
Job highlights
Summary
Join Canonical's data platform team to develop managed solutions for a range of data stores and technologies and to help build a comprehensive automation suite. The role involves developing and automating infrastructure features of data platforms, collaborating with a distributed team, writing high-quality Python code, and working from home.
Requirements
- Proven hands-on experience in software development using Python
- Proven hands-on experience in distributed systems, such as Kafka and Spark
- A Bachelor's degree or equivalent in Computer Science, STEM, or a similar field
- Willingness to travel up to 4 times a year for internal events
Responsibilities
- Collaborate proactively with a distributed team
- Write high-quality, idiomatic Python code to create new features
- Debug issues and interact with upstream communities publicly
- Work with helpful and talented engineers, including experts in many fields
- Discuss ideas and collaborate on finding good solutions
Benefits
- Fully remote working environment - we've been working remotely since 2004!
- Personal learning and development budget of USD 2,000 per annum
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Parental leave
- Employee Assistance Programme