Remote Senior Data Engineer

Rackspace Technology

💵 $100k-$200k
📍 Remote - United States

Job highlights

Summary

The role is for a Data Platform Engineer who will design and build resilient data pipelines, architect cloud data infrastructure, collaborate with cross-functional teams, and lead a team of engineers. It requires extensive experience with Azure data services, strong programming skills, knowledge of data modeling, familiarity with Agile development methods, and certification in Azure Data Engineering.

Requirements

  • 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse
  • 3+ years of experience architecting data pipelines that ingest structured and unstructured sources for batch and real-time workloads
  • Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as Firewall, Storage, and Key Vault
  • Strong programming and scripting experience with SQL, Python, and Spark
  • Strong grasp of data modeling and data lakehouse concepts
  • Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket
  • Experience with Agile development methods in data-oriented projects
  • Highly motivated self-starter and team player with demonstrated success in prior roles
  • Track record of success working through technical challenges within enterprise organizations
  • Ability to prioritize deals, training, and initiatives through highly effective time management
  • Excellent problem solving, analytical, presentation, and whiteboarding skills
  • Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems
  • Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations
  • Certification in Azure Data Engineering and related technologies

Responsibilities

  • Design and build resilient and efficient data pipelines for batch and real-time streaming
  • Architect and design data infrastructure on cloud using Infrastructure-as-Code tools
  • Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools
  • Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions about data infrastructure, PaaS services, design patterns, and implementation approaches
  • Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities
  • Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations
  • Lead a team of engineers to deliver impactful results at scale
  • Execute projects with an Agile mindset
  • Build software frameworks to solve data problems at scale
This job is filled or no longer available