NucleusTeq is hiring a Data Engineer


πŸ’΅ $100k-$150k
πŸ“Remote - India

Summary

NucleusTeq, a software services company, is hiring a Data and Analytics Engineer. The role involves developing and deploying data and analytics solutions on Google Cloud Platform (GCP), supporting business analysts, designing infrastructure, and participating in an on-call rotation. The position is fully remote.

Requirements

  • Strong working knowledge of cloud offerings and solutions on Google Cloud Platform
  • Areas of expertise we are looking for: Python, Spark, PySpark, Scala, Dataproc, BigQuery, Bigtable, Pub/Sub, and GKE (see the pipeline sketch after this list)
  • Deep knowledge of Google Cloud compute, storage, logging, monitoring, auditing, and security services, as well as DevOps practices
  • Strong experience with Google Cloud Platform services such as Compute (Compute Engine, Kubernetes Engine), Storage (Cloud Storage, buckets, Persistent Disk), Identity (Cloud Identity, Cloud IAM), Security (Cloud Security Command Center, Cloud Security Scanner), Networking (VPC, Cloud Load Balancing), and Management Tools (Stackdriver Monitoring and Logging)
  • Deep understanding of PaaS and IaaS offerings in the cloud
  • Sound experience with Infrastructure as Code (e.g., Terraform) and serverless deployment services (Cloud Run, Cloud Functions), along with servers, virtualization, and cloud computing
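
To make the stack above concrete, here is a minimal sketch of the kind of batch pipeline the role describes: reading raw events from Cloud Storage with PySpark on Dataproc and writing a daily aggregate to BigQuery. All bucket, project, and table names are hypothetical placeholders, and the BigQuery write assumes the spark-bigquery connector, which is typically preinstalled on Dataproc cluster images.

    # Minimal PySpark pipeline sketch: Cloud Storage (JSON) -> BigQuery.
    # All resource names below are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    SOURCE_PATH = "gs://example-bucket/raw/events/*.json"     # hypothetical bucket
    TARGET_TABLE = "example-project.analytics.daily_events"   # hypothetical table

    spark = SparkSession.builder.appName("daily-events-pipeline").getOrCreate()

    # Read raw JSON events landed in Cloud Storage.
    events = spark.read.json(SOURCE_PATH)

    # Aggregate event counts per user per day.
    daily = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("event_date", "user_id")
        .agg(F.count("*").alias("event_count"))
    )

    # Write to BigQuery via the spark-bigquery connector
    # (available on Dataproc cluster images).
    (
        daily.write
        .format("bigquery")
        .option("table", TARGET_TABLE)
        .option("temporaryGcsBucket", "example-temp-bucket")  # hypothetical bucket
        .mode("overwrite")
        .save()
    )

Submitted with `gcloud dataproc jobs submit pyspark`, a job like this touches the Dataproc, Cloud Storage, and BigQuery pieces of the list above in roughly twenty lines.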

Responsibilities

  • Develop and Deploy data and analytics-led solutions on GCP
  • Support the Business/Technical Analyst in workshops with LOBs, Application Owners, and Application Architects to understand business requirements and develop Functional and Non-Functional Requirements
  • Design and build infrastructure and systems that provide high levels of scalability, reliability, and performance while balancing security, maintainability, and operational excellence
  • Work with the client's Data Engineers, Architects, and Product Owners to design the solution details of the applications to be migrated, and ensure the detailed design conforms to user expectations
  • Design, build, and test data processing pipelines in a GCP environment using Python, Spark, PySpark, or Scala (see the unit-test sketch after this list)
  • Provide support with application testing, UAT, and application migration in GCP
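
Since the role emphasizes testing as well as building pipelines, here is a hedged sketch of how the aggregation step from the pipeline above could be unit tested with pytest against a local SparkSession; the function name and all test data are illustrative, not part of the posting.

    # Unit-test sketch for the daily aggregation logic (names illustrative).
    import pytest
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    @pytest.fixture(scope="session")
    def spark():
        # Local SparkSession so the test runs without a Dataproc cluster.
        return SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()

    def aggregate_daily(events):
        # Same transformation as in the pipeline sketch above.
        return (
            events
            .withColumn("event_date", F.to_date("event_timestamp"))
            .groupBy("event_date", "user_id")
            .agg(F.count("*").alias("event_count"))
        )

    def test_aggregate_daily_counts_per_user(spark):
        events = spark.createDataFrame(
            [
                ("u1", "2024-01-01 10:00:00"),
                ("u1", "2024-01-01 11:00:00"),
                ("u2", "2024-01-01 09:30:00"),
            ],
            ["user_id", "event_timestamp"],
        )
        result = {
            (row["event_date"].isoformat(), row["user_id"]): row["event_count"]
            for row in aggregate_daily(events).collect()
        }
        assert result[("2024-01-01", "u1")] == 2
        assert result[("2024-01-01", "u2")] == 1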

Preferred Qualifications

Capable of presenting analyses and recommendations to leadership or discussing the technical merits of solutions with engineers and architects

Benefits

Location: Remote (Work from Home)
