Senior Big Data Engineer

Rackspace Technology

πŸ’΅ $116k-$198k
πŸ“Remote - Worldwide

Job highlights

Summary

Join our team as a Senior Big Data Engineer and leverage your expertise in developing batch processing systems using technologies like Hadoop, Oozie, Airflow, and GCP. This remote position requires strong communication and problem-solving skills. You will develop scalable code, manage data workflows, and implement DevOps best practices. The ideal candidate possesses extensive experience in cloud-based batch processing and a strong background in Java and Python. This role offers competitive compensation and benefits, with pay ranges varying by location.

Requirements

  • Bachelor's degree in Computer Science, Software Engineering, or a related field of study
  • Experience with managed cloud services and understanding of cloud-based batch processing systems are critical
  • Proficiency in Oozie, Airflow, MapReduce, and Java
  • Strong programming skills with Java (specifically Spark), Python, Pig, and SQL
  • Expertise in public cloud services, particularly in GCP
  • Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce
  • Familiarity with BigTable and Redis
  • Experience applying infrastructure and DevOps principles in daily work, using continuous integration and continuous deployment (CI/CD) tooling and Infrastructure as Code (IaC) tools such as Terraform to automate and improve development and release processes
  • Proven experience in engineering batch processing systems at scale (see the illustrative sketch after this list)
  • 5+ years of experience in customer-facing software/technology or consulting
  • 5+ years of experience with "on-premises to cloud" migrations or IT transformations
  • 5+ years of experience building and operating solutions on GCP (ideally) or AWS/Azure
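
As an illustrative sketch only (not part of the posting's requirements), the batch-processing experience described above typically involves Spark jobs of roughly the following shape. This minimal PySpark example is hypothetical: the bucket paths, column names, and application name are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical paths; production jobs would typically read from and write to
# HDFS or Cloud Storage locations defined by the pipeline configuration.
INPUT_PATH = "gs://example-bucket/events/date=2024-01-01/*.parquet"
OUTPUT_PATH = "gs://example-bucket/aggregates/date=2024-01-01"

spark = SparkSession.builder.appName("daily-event-aggregation").getOrCreate()

# Read the day's raw events, aggregate per user and event type, and write
# the result back out as Parquet.
events = spark.read.parquet(INPUT_PATH)
daily_counts = (
    events.groupBy("user_id", "event_type")
    .agg(F.count("*").alias("event_count"))
)
daily_counts.write.mode("overwrite").parquet(OUTPUT_PATH)

spark.stop()
```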

Responsibilities

  • Develop scalable and robust code for batch processing systems, working with technologies like Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase
  • Develop, manage, and optimize data workflows using Oozie and Airflow within the Apache Hadoop ecosystem (a minimal Airflow sketch follows this list)
  • Leverage GCP for scalable big data processing and storage solutions
  • Implement automation/DevOps best practices for CI/CD, IaC, etc
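
As a hedged illustration of the workflow-management responsibility above, a minimal Airflow DAG might look like the following. The DAG id, schedule, and task commands are hypothetical placeholders, not an actual Rackspace pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical defaults; real pipelines would tune retries and alerting.
default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Placeholder tasks standing in for real extract / Spark / load steps.
    extract = BashOperator(
        task_id="extract_raw_events",
        bash_command="echo 'pull raw events from the source system'",
    )
    transform = BashOperator(
        task_id="run_spark_aggregation",
        bash_command="echo 'spark-submit the aggregation job'",
    )
    load = BashOperator(
        task_id="load_to_serving_store",
        bash_command="echo 'write aggregates to the serving store'",
    )

    # Run the steps strictly in sequence, once per day.
    extract >> transform >> load
```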

Benefits

  • The anticipated starting pay range for Colorado is: $116,100 - $170,280
  • The anticipated starting pay range for the states of Hawaii and New York (not including NYC) is: $123,600 - $181,280
  • The anticipated starting pay range for California, New York City and Washington is: $135,300 - $198,440
  • Unless already included in the posted pay range and based on eligibility, the role may include variable compensation in the form of bonus, commissions, or other discretionary payments
  • Remote work
