Remote Senior Big Data Hadoop ML Engineer
Rackspace Technology
Remote - Canada
Job highlights
Summary
This is a remote Senior Big Data Engineer role at Rackspace Technology. The ideal candidate will have extensive experience with the Apache Hadoop ecosystem, Java, Python, Spark, and GCP, and will develop scalable, robust code for large-scale batch processing systems, manage and maintain batch pipelines supporting Machine Learning workloads, and implement automation/DevOps best practices.
Requirements
- Proficiency in the Hadoop ecosystem, including MapReduce, Oozie, Hive, Pig, HBase, and Storm (a MapReduce sketch follows this list)
- Strong programming skills with Java, Python, and Spark
- Knowledge of public cloud services, particularly GCP
- Experience applying Infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes
- Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions
- Proven ability to work effectively in a remote setting, maintaining strong written and verbal communication skills and collaborating with team members and stakeholders to ensure a clear understanding of technical requirements and project goals
- Proven experience in engineering batch processing systems at scale
- Hands-on experience with public cloud platforms, particularly GCP; additional experience with other cloud technologies is advantageous
- Experience with batch pipelines supporting Machine Learning workloads
- Strong experience in a programming language such as Java
- 10+ years of experience in customer-facing software/technology or consulting
- 5+ years of experience with "on-premises to cloud" migrations or IT transformations
- Technical degree in Computer Science, Software Engineering, or a related field
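
To give candidates a concrete sense of the MapReduce proficiency listed above, here is a minimal sketch of a Hadoop MapReduce job in Java. The class names, the tab-separated log format, and the counting logic are illustrative assumptions only, not Rackspace code:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class EventCount {
    // Mapper: emit (event_type, 1) for each tab-separated log line.
    public static class EventMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text eventType = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length > 1) {
                // Assumes the event type is the second field; hypothetical format.
                eventType.set(fields[1]);
                context.write(eventType, ONE);
            }
        }
    }

    // Reducer: sum the counts per event type.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "event-count");
        job.setJarByClass(EventCount.class);
        job.setMapperClass(EventMapper.class);
        job.setCombinerClass(SumReducer.class); // summing is associative, so reuse as combiner
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

A job like this would typically be packaged as a JAR and submitted with `hadoop jar`, or scheduled as an action in an Oozie workflow of the kind this role works with.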
Responsibilities
- Develop scalable and robust code for large-scale batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase (a Spark sketch follows this list)
- Develop, manage, and maintain batch pipelines supporting Machine Learning workloads
- Leverage GCP for scalable big data processing and storage solutions
- Implement automation/DevOps best practices for CI/CD, IaC, etc.
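
As a rough illustration of the first responsibility, the following is a minimal Spark (Java) batch job sketch; the Cloud Storage paths, column names, and aggregation are hypothetical placeholders:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;

public class BatchFeatureJob {
    public static void main(String[] args) {
        // Batch job skeleton: read raw events, filter, aggregate, write features.
        SparkSession spark = SparkSession.builder()
                .appName("BatchFeatureJob")
                .getOrCreate();

        // Input/output paths are hypothetical placeholders.
        Dataset<Row> events = spark.read().parquet("gs://example-bucket/raw/events/");

        // Count click events per user as a simple example feature.
        Dataset<Row> features = events
                .filter(col("event_type").equalTo("click"))
                .groupBy(col("user_id"))
                .count()
                .withColumnRenamed("count", "click_count");

        features.write().mode("overwrite").parquet("gs://example-bucket/features/clicks/");

        spark.stop();
    }
}
```

On GCP, a batch job like this would commonly run on Dataproc and read from and write to Cloud Storage, which is the pattern the gs:// paths above assume.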
Preferred Qualifications
- Familiarity with Terraform
- Familiarity with Python
- 5+ years of experience building and operating solutions on GCP