SIEM Security Engineer

HashiCorp

πŸ“Remote - Canada

Job highlights

Summary

Join HashiCorp's Threat Detection and Response Team as a Data Engineer to enhance security across our products and the enterprise. You will be responsible for expanding visibility across the major cloud providers and ensuring that actions performed there are accurately recorded. The role involves partnering with engineering and other stakeholders to define and drive secure-by-default environments. The team is heavily invested in tooling and automation and continually improves both. While prior remote work experience isn't mandatory, independence and autonomy are key. The position requires experience with large-scale data collection in the cloud and a range of data processing techniques.

Requirements

  • 2+ years in an engineering role focused on large-scale data collection in the cloud using cloud-native tooling (see the sketch after this list)
  • Working knowledge of batch or streaming data processing pipelines
  • Working knowledge of information retrieval patterns and query workload optimization
  • Experience working with multiple data query models
  • Natural curiosity about and interest in the Threat Detection, Incident Response, Fraud, and/or Threat Intel problem space, and the desire to develop these skill areas while serving in a development-focused role
  • Publicly released tools or modules, or open source contributions
  • Solid foundation in Linux and exposure to Linux in cloud provider environments
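
As a rough, hypothetical illustration of what "large-scale data collection in the cloud using cloud-native tooling" can look like, the sketch below batch-collects gzipped CloudTrail audit log objects from S3 with boto3 and yields the individual audit records. The bucket name and prefix are placeholders, not HashiCorp resources or the team's actual pipeline.

```python
# Minimal sketch only: batch collection of CloudTrail audit logs from S3.
# The bucket and prefix are hypothetical placeholders.
import gzip
import json

import boto3

BUCKET = "example-cloudtrail-bucket"         # hypothetical
PREFIX = "AWSLogs/123456789012/CloudTrail/"  # hypothetical

s3 = boto3.client("s3")

def iter_cloudtrail_events(bucket: str, prefix: str):
    """Yield individual CloudTrail records from gzipped log objects."""
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            payload = json.loads(gzip.decompress(body))
            yield from payload.get("Records", [])

if __name__ == "__main__":
    for event in iter_cloudtrail_events(BUCKET, PREFIX):
        print(event["eventTime"], event["eventName"], event.get("eventSource"))
```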

Responsibilities

  • Ensure best practices are implemented across our multi-cloud environment
  • Partner with engineering and other stakeholders to define and drive secure by default environments supporting our products and the enterprise
  • Continually improve our tooling and automation
  • Collect, normalize, tag, and enrich security data (see the sketch after this list)
  • Apply windowing and time-series transformations
  • Develop aggregates, views, summaries, and indices to accelerate access to data
  • Profile query workloads using query planner output or other diagnostic tooling to identify performance bottlenecks
  • Profile resource consumption to optimize expenditure on storage and transit
  • Plan, dispatch, and monitor query workloads to ensure on-time delivery of information with optimal use of resources
  • Maintain and evolve shared query content through source code management practices
  • Take a periodic on-call rotation in a distributed team
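
To make the collect/normalize/tag/enrich and windowing bullets above concrete, here is a minimal pandas sketch using assumed column names and an assumed list of admin principals; it is an illustration only, not HashiCorp's actual schema or pipeline.

```python
# Minimal sketch: normalize, tag/enrich, and window a small batch of audit
# events with pandas. Column names and the admin list are hypothetical.
import pandas as pd

raw = pd.DataFrame(
    {
        "eventTime": ["2024-05-01T12:00:03Z", "2024-05-01T12:02:41Z", "2024-05-01T12:06:10Z"],
        "eventName": ["ConsoleLogin", "AssumeRole", "ConsoleLogin"],
        "userIdentityArn": [
            "arn:aws:iam::111:user/alice",
            "arn:aws:iam::111:user/alice",
            "arn:aws:iam::111:user/bob",
        ],
    }
)

known_admins = {"arn:aws:iam::111:user/alice"}  # hypothetical enrichment source

# Normalize: lowercase column names, parse timestamps
events = raw.rename(columns=str.lower)
events["eventtime"] = pd.to_datetime(events["eventtime"], utc=True)

# Tag / enrich: flag events performed by known admin principals
events["is_admin"] = events["useridentityarn"].isin(known_admins)

# Windowing / time-series transformation: 5-minute event counts per principal
windowed = (
    events.set_index("eventtime")
    .groupby("useridentityarn")
    .resample("5min")["eventname"]
    .count()
    .rename("event_count")
)
print(windowed)
```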

Preferred Qualifications

  • Experience with Python or Go, or experience with other languages and a willingness to learn
  • Experience with Terraform, Vault, Packer
  • Experience with AWS, GCP, Azure
  • Experience with AWS EC2, Lambda, Step Functions, ECR/ECS/EKS, S3
  • Experience with logging infrastructure and ETL pipelines: fluentd, logstash, vector, kafka, kinesis, or similar (see the sketch after this list)
  • Experience with CI/CD: building pipelines involving Jenkins, CircleCI, GH Actions, etc.
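
As a small illustration of the Lambda and Kinesis bullets above, the sketch below is a hypothetical Lambda handler that decodes log records arriving on a Kinesis stream; the source field names are assumptions, not an actual HashiCorp event schema.

```python
# Minimal sketch: AWS Lambda handler that decodes Kinesis records and emits
# normalized log entries. Source field names are hypothetical.
import base64
import json

def handler(event, context):
    """Decode Kinesis records and return a count of processed events."""
    normalized = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        doc = json.loads(payload)
        normalized.append(
            {
                "time": doc.get("eventTime"),      # hypothetical source field
                "action": doc.get("eventName"),    # hypothetical source field
                "source": doc.get("eventSource"),  # hypothetical source field
            }
        )
    # A real pipeline would forward these to a sink (S3, Kafka, a SIEM index, etc.)
    print(json.dumps(normalized))
    return {"processed": len(normalized)}
```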

Benefits

Engineering at HashiCorp is largely a remote team
