Cloud Software Engineer


Egen

💵 $120k-$150k
📍 Remote - Worldwide

Summary

Join Egen, a fast-growing, data-first company, as a fully remote Cloud Software Engineer. Design and implement cloud-native applications on Google Cloud Platform (GCP), developing and maintaining a microservices architecture. Build modern data pipelines using Apache Airflow and apply scaling principles to keep systems robust. Collaborate with teams to deliver high-performance software solutions using Python, Node.js, or Java. This role requires 4+ years of GCP experience and 5+ years of software development experience. Egen offers a competitive salary and a comprehensive benefits package.

Requirements

  • 4+ years of professional experience with GCP or AWS services, including databases, messaging frameworks (Pub/Sub, SNS/SQS, Kafka), APIs, serverless functions (Cloud Run, AWS Lambda), and containers
  • 5+ years of software development experience, with strong skills in Python, Node.js, or Java
  • Experience with Apache Airflow for data pipeline orchestration
  • Proficiency in microservices architecture, API design, and understanding of distributed systems, including CAP theorem trade-offs
  • Experience building applications with Apache Kafka, Elasticsearch, and Redis on Kubernetes or VM clusters
  • Experience in software lifecycle best practices, such as unit testing, static code analysis, and incremental refactoring
  • Solid understanding of troubleshooting and reductionist techniques in software development
  • Excellent communication skills and ability to work collaboratively in a team environment
  • Experience with MySQL, PostgreSQL, Bitbucket, and GitHub
  • Knowledge of IAM, JWT-based authorization, and SSO/OAuth (Azure AD, Okta)
  • Knowledge of the 12-factor app methodology and how it should be applied

Responsibilities

  • Design and implement cloud-native applications in a serverless environment such as Google Cloud Run or Kubernetes
  • Develop and maintain a microservices architecture, considering API design and trade-offs; ensure adherence to distributed systems principles, including the CAP theorem's implications for compute and data workloads
  • Develop modern data pipelines using Apache Airflow
  • Apply scaling principles to ensure system robustness, including load estimation, failure management, rate limiting, and quota management
  • Develop and scale high-performance software solutions using Python, Node.js, or Java
  • Collaborate with teams to architect, develop, and troubleshoot scalable and reliable solutions on GCP
  • Employ reductionist techniques for effective problem-solving and system optimization

Benefits

  • Comprehensive Health Insurance
  • Paid Leave (Vacation/PTO)
  • Paid Holidays
  • Sick Leave
  • Parental Leave
  • Bereavement Leave
  • 401(k) Employer Match
  • Employee Referral Bonuses

