Skyflow is hiring a
Senior Backend Software Engineer

💵 $150k-$200k
📍 Remote - United States

Summary

The role is for a Senior Backend Software Engineer focused on data pipelines at Skyflow, a data privacy vault company. The work involves designing and developing complex data processing workflows in cloud-native environments using technologies such as Kafka, Docker, and Kubernetes. The position offers remote work, a competitive salary, health, dental, and vision insurance, a 401k, generous PTO, flexible hours, equity, and a work-from-home expense benefit in eligible countries.

Requirements

  • Solid experience with Golang (Must-Have), developing highly available, scalable, production-level code for microservices, batch architectures, and lambda functions
  • Excellent understanding of data structures; you're comfortable keeping a continuous flow of data structures in memory
  • Experience working with multiple file formats (CSV, JSON, Parquet, Avro, Delta Lake, etc.); a minimal Go sketch illustrating this follows the list
  • Knowledge of data warehouse technical architectures, Docker and Kubernetes infrastructure components, and how to develop secure ETL pipelines
  • Experience with pub/sub systems such as Kafka
  • Experience working in a Big Data environment (Hadoop, Spark, Hive, Redshift, etc.)
  • Experience with relational and non-relational databases
  • Experience building real-time streaming data pipelines is a plus
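
As a rough illustration of the file-format and in-memory data-flow requirements above, here is a minimal Go sketch (an assumption for illustration, not Skyflow code) that streams CSV rows and re-encodes them as JSON using only the standard library; the input file name and column layout are hypothetical.

// Minimal sketch: stream CSV rows and re-encode them as JSON, one record at a
// time, without loading the whole file into memory. "events.csv" is a
// hypothetical input file used only for illustration.
package main

import (
	"encoding/csv"
	"encoding/json"
	"io"
	"log"
	"os"
)

func main() {
	f, err := os.Open("events.csv")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	r := csv.NewReader(f)
	header, err := r.Read() // first row is assumed to hold column names
	if err != nil {
		log.Fatal(err)
	}

	enc := json.NewEncoder(os.Stdout)
	for {
		row, err := r.Read()
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		// Build one record per row and emit it immediately.
		rec := make(map[string]string, len(header))
		for i, col := range header {
			rec[col] = row[i]
		}
		if err := enc.Encode(rec); err != nil {
			log.Fatal(err)
		}
	}
}

Parquet, Avro, and Delta Lake handling would typically rely on third-party Go libraries rather than the standard library shown here.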

Responsibilities

  • Containerize each component of the data pipeline (e.g., ETL processes, databases, data processing applications) by writing Dockerfiles and building Docker images
  • Set up Kubernetes clusters to manage and orchestrate Docker containers, deploy Pods, and create Services and load-balancing policies
  • Use Kubernetes Volumes for managing data and stateful applications, ensuring that data persists beyond the lifespan of individual Pods
  • Configure Kafka for scalability, ensuring it can handle high volumes of data streams efficiently
  • Configure Kafka brokers, topics, producers, and consumers, and use Kafka Connect to integrate with external databases, systems, or other data sources/sinks (a Go configuration sketch follows this list)
  • Implement logging and monitoring solutions to track the health and performance of your data pipelines (a minimal example also follows the list)
  • Troubleshoot connectivity issues to common datastores such as Amazon S3 and Azure Data Lake
  • Implement network policies in Kubernetes for secure communication between different services
  • Follow best practices for security, such as securing Kafka clusters and implementing proper access controls
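
To make the Kafka items above concrete, here is a minimal sketch of broker, topic, producer, and consumer-group configuration, assuming the open-source github.com/segmentio/kafka-go client; the broker address, topic name, and group ID are placeholders, not values from this role.

// Minimal sketch of Kafka producer and consumer configuration using the
// github.com/segmentio/kafka-go client. Broker address, topic, and group ID
// are illustrative placeholders.
package main

import (
	"context"
	"log"

	"github.com/segmentio/kafka-go"
)

func main() {
	ctx := context.Background()

	// Producer: write one message to a placeholder "pipeline-events" topic.
	w := &kafka.Writer{
		Addr:     kafka.TCP("localhost:9092"),
		Topic:    "pipeline-events",
		Balancer: &kafka.LeastBytes{},
	}
	defer w.Close()
	if err := w.WriteMessages(ctx, kafka.Message{
		Key:   []byte("record-1"),
		Value: []byte(`{"status":"ok"}`),
	}); err != nil {
		log.Fatal(err)
	}

	// Consumer: read from the same topic as part of a consumer group; scaling
	// throughput means running more workers in the same group.
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		Topic:   "pipeline-events",
		GroupID: "pipeline-workers",
	})
	defer r.Close()

	msg, err := r.ReadMessage(ctx)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("consumed key=%s value=%s", msg.Key, msg.Value)
}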
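Similarly, a minimal sketch of the logging and health-reporting item, using only the Go standard library (log/slog and net/http); the port and endpoint path are assumptions made for illustration.

// Minimal sketch: structured JSON logging plus a liveness endpoint that a
// Kubernetes probe could poll. Port and path are illustrative placeholders.
package main

import (
	"log/slog"
	"net/http"
	"os"
)

func main() {
	logger := slog.New(slog.NewJSONHandler(os.Stdout, nil))

	// Liveness endpoint for cluster health checks.
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
		w.Write([]byte("ok"))
	})

	logger.Info("pipeline worker starting", "addr", ":8080")
	if err := http.ListenAndServe(":8080", nil); err != nil {
		logger.Error("server stopped", "err", err)
		os.Exit(1)
	}
}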

Benefits

  • Work from home expense (U.S., Canada, and Australia)
  • Excellent Health, Dental, and Vision Insurance Options (Varies by Country)
  • Vanguard 401k
  • Very generous PTO
  • Flexible Hours
  • Generous Equity

Please let Skyflow know you found this job on JobsCollider. Thanks! 🙏