Rackspace Technology is hiring a
Sr Big Data Engineer in United States

Sr Big Data Engineer
🏢 Rackspace Technology
💵 $120k-$180k
📍United States
📅 Posted on Jul 1, 2024

Summary

This is a remote Senior Big Data Engineer role requiring expertise in the Apache Hadoop ecosystem and GCP, along with experience in Oozie, Airflow, Java, Python, Pig, and SQL. The role involves developing batch processing systems, optimizing data workflows, and implementing automation/DevOps best practices.

Requirements

  • Experience with GCP managed services and understanding of cloud-based batch processing systems are critical
  • Proficiency in Oozie, Airflow, MapReduce, and Java
  • Strong programming skills with Java (specifically Spark), Python, Pig, and SQL
  • Expertise in public cloud services, particularly in GCP
  • Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce
  • Familiarity with Bigtable and Redis
  • Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes
  • Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions
  • Experience working effectively in a remote setting, with strong written and verbal communication skills; able to collaborate with team members and stakeholders to ensure a clear understanding of technical requirements and project goals
  • Proven experience in engineering batch processing systems at scale
  • Hands-on experience in public cloud platforms, particularly GCP. Additional experience with other cloud technologies is advantageous
  • Google Associate Cloud Engineer Certification or other Google Cloud Professional level certification
  • 10+ years of experience in customer-facing software/technology or consulting
  • 5+ years of experience with ‘on-premises to cloud’ migrations or IT transformations
  • 5+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure
  • Technical degree in Computer Science, Software Engineering, or a related field

Responsibilities

  • Develop scalable and robust code for batch processing systems using technologies such as Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase
  • Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem
  • Leverage GCP for scalable big data processing and storage solutions
  • Implement automation/DevOps best practices for CI/CD, IaC, etc.
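For readers less familiar with the MapReduce model this role centers on, here is a toy word-count sketch in plain Python. It is not part of the posting and does not use Hadoop itself; it only illustrates the map, shuffle, and reduce phases that frameworks like Hadoop MapReduce and Spark distribute across a cluster:

```python
from collections import defaultdict

def map_phase(lines):
    """Map step: emit a (word, 1) pair for every word in the input."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle_phase(pairs):
    """Shuffle step: group all values by key, as the framework would
    between mappers and reducers."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce step: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

# Sample input standing in for lines of a distributed file
lines = ["big data batch processing", "batch jobs process big data"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
```

In a real Hadoop or Spark job, the map and reduce functions run on different machines and the shuffle moves data over the network; the logical structure, however, is the same as this single-process sketch.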
