DevOps Engineer

phData

πŸ“Remote - India

Job highlights

Summary

Join phData, a leading modern data stack company, and become a DevOps Engineer in our growing Bangalore, India office. We offer a competitive compensation plan including base salary, annual bonus, training, certifications, and equity. As a DevOps Engineer, you will manage modern data platforms, learn new technologies, and solve challenging problems. You will work with various cloud platforms and technologies, demonstrating ownership of tasks across multiple customer accounts. We are a remote-first global company with a casual and exciting work environment. We value diversity and inclusion.

Requirements

  • Working knowledge of SQL and the ability to write, debug, and optimize SQL queries
  • Good understanding of writing and optimizing Python programs
  • Experience in providing operational support across a large user base for a cloud-native data warehouse (Snowflake and/or Redshift)
  • Experience with cloud-native data technologies in AWS or Azure
  • Proven experience learning new technology stacks
  • Strong troubleshooting and performance tuning skills
  • Client-facing written and verbal communication skills and experience

Responsibilities

  • Operate and manage modern data platforms, from streaming to data lakes to analytics and beyond, across a progressively evolving technical stack
  • Learn new technologies in a quickly changing field
  • Own the execution of tasks and field questions about the tasks other engineers are working on for the project
  • Respond to pager incidents, solve hard and challenging problems, and go deep into customer processes and workflows to resolve issues
  • Demonstrate clear ownership of tasks on multiple simultaneous customer accounts across a variety of technical stacks
  • Continually grow, learn, and stay up to date with the MS technology stack
  • Work 24/7 rotational shifts

Preferred Qualifications

  • Production experience and certifications in core data platforms such as Snowflake, AWS, Azure, GCP, Hadoop, or Databricks
  • Production experience working with cloud and distributed data storage technologies such as S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
  • Production experience working with data integration technologies such as Spark, Kafka, event/streaming platforms, StreamSets, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or others
  • Production experience working with workflow management and orchestration tools such as Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, or NiFi
  • Working experience with infrastructure as code using Terraform or CloudFormation
  • Expertise in a scripting language to automate repetitive tasks (Python preferred)
  • Well versed in continuous integration and deployment frameworks, with hands-on experience using CI/CD tools such as Bitbucket, GitHub, Flyway, and Liquibase
  • Bachelor's degree in Computer Science or a related field

Benefits

  • Medical Insurance for Self & Family
  • Medical Insurance for Parents
  • Term Life & Personal Accident
  • Wellness Allowance
  • Broadband Reimbursement
  • Professional Development Allowance
  • Certification and Skill Upgrade Reimbursement
