Cloud DevOps Engineer

Global Fishing Watch

💵 $70k-$110k
📍 Remote - Canada, United States

Summary

Join Global Fishing Watch as a Cloud DevOps and Software Engineer! This remote role, based in select countries, focuses on ensuring the security and efficiency of our Google Cloud Platform (GCP) infrastructure. You will manage cloud project budgets, address security vulnerabilities, and implement best practices. Responsibilities include infrastructure management using Terraform, creating dashboards for cost and performance monitoring, and providing ad-hoc support to research and engineering teams. The ideal candidate possesses significant DevOps experience, expertise in GCP services, and strong scripting skills. This is a fixed-term position funded through December 2027, with potential for extension.

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or related field (or equivalent professional experience)
  • Significant software development experience as a DevOps Engineer or in a similar software engineering role
  • Experience working in agile software development teams
  • Experience working with Docker
  • Experience creating and managing infrastructure with Terraform
  • Experience working with Cloud technologies, specifically GCP (Google Cloud Platform)
  • Experience in designing and/or developing infrastructure, configuration, and deployment automation at large scale/high complexity
  • Practical experience with virtualization, storage, and networking
  • Expertise in scripting languages (Shell, Python) and Unix scripting
  • Experience in software release management: Git, CI/CD
  • Fluent English speaker
  • Good judgment and the ability to make and justify recommendations
  • Excellent written and oral English communication skills, both to collaborate effectively with designers, developers, product managers, and other staff or clients, and to communicate complex technical and scientific information to non-technical audiences
  • Team player, willing to work with, teach and learn from the GFW team
  • Comfortable working in a small but fast-growing organization, with changing instructions and requirements
  • Intellectually curious, forward-thinking, willing to suggest/try new technologies and creative approaches to problems
  • Excellent organizational and time management skills, and the ability to handle multiple projects
  • Disciplined and methodical
  • A problem-solving mindset and excellent troubleshooting skills
  • Demonstrated ability to work remotely and asynchronously

Responsibilities

  • Ensure all cloud project spending stays within budget limits
  • Fix security vulnerabilities directly or coordinate with the relevant resource owners; vulnerabilities include outdated VM operating systems and outdated libraries in generated images
  • Continuously clean up cloud project resources such as BigQuery datasets, Compute Engine VMs and disks, Cloud Storage buckets, and other GCP services in use
  • Ensure both the research and engineering teams follow cloud best practices, such as adding labels to all created resources, cleaning up data and virtual machines during and after projects, keeping unused VMs shut down, and using correct networks instead of public IPs
  • Jointly with the CTO and Cloud Security Engineer, keep the Cloud best practices up to date and help distribute updates to those using the cloud
  • Jointly with the Senior DevOps Engineer, improve the current GCP infrastructure and manage the tools used by the team
  • Work jointly with the Data Team to make sure public resources have the corresponding documentation
  • Create monitors for cloud compute costs and performance, and ensure they are set up, meaningful, and assigned to the corresponding owners
  • Create and configure resources for the different users using Terraform
  • Bring ideas for improving the efficiency of the entire cloud setup
  • Create dashboards to help the different cloud project owners easily understand compute costs, including high-level summary cost dashboards
  • Create dashboards with metrics we want to track, such as the number of incidents reported and fixed, and the number of vulnerabilities per month
  • Periodically send brief cloud status updates to the CTO and to the teams using the cloud
  • The Cloud DevOps Engineer will also have software engineering tasks and may provide ad-hoc scripting support to both the Research and Engineering teams, such as downloading static datasets needed for research
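As a toy illustration of the budget-monitoring duty above, the sketch below flags projects approaching their spending limit. All project names, figures, and the 80% threshold are hypothetical; a real implementation would pull month-to-date spend from the Cloud Billing API or a BigQuery billing export rather than hard-coded dictionaries.

```python
# Hypothetical sketch: flag GCP projects whose month-to-date spend
# meets or exceeds a set share of their budget. In practice these
# figures would come from the Cloud Billing API / billing export.

ALERT_THRESHOLD = 0.8  # warn once 80% of the budget is spent


def over_budget_projects(spend, budgets, threshold=ALERT_THRESHOLD):
    """Return (project, spend, budget) tuples at or above the threshold,
    sorted by how much of the budget has been consumed."""
    flagged = []
    for project, budget in budgets.items():
        used = spend.get(project, 0.0)
        if budget > 0 and used / budget >= threshold:
            flagged.append((project, used, budget))
    return sorted(flagged, key=lambda t: t[1] / t[2], reverse=True)


# Example with made-up numbers:
budgets = {"research-pipeline": 1000.0, "web-app": 500.0, "sandbox": 200.0}
spend = {"research-pipeline": 950.0, "web-app": 120.0, "sandbox": 180.0}

for project, used, budget in over_budget_projects(spend, budgets):
    print(f"{project}: ${used:.0f} of ${budget:.0f} ({used / budget:.0%})")
```

The flagged list would then feed a cost dashboard or a periodic alert to the corresponding project owners.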

Preferred Qualifications

  • Experience creating Looker dashboards
  • Database administration and SQL experience
  • Experience deploying applications with monorepo architecture
  • Go, JavaScript and R experience
  • NestJS framework experience
  • PostGIS, GDAL, and other GIS tools
  • Experience working with Google Earth Engine
  • Fluent Spanish speaker

Benefits

  • Pension/retirement
  • Health and other benefits commensurate with similar level GFW employees in the country of employment
  • Remote work
