Solutions Architect - Data Engineering

Databricks

πŸ’΅ $95k-$168k
πŸ“Remote - Canada

Summary

Join Databricks as a Specialist Solutions Architect (SSA) - Data Engineering and guide customers in building big data solutions on the Databricks platform. This customer-facing role requires hands-on experience with Apache Sparkβ„’ and expertise in other data technologies. You will assist in the design and implementation of workloads, aligning customer technical roadmaps with the Databricks Data Intelligence Platform. As a go-to expert, you will strengthen your technical skills through mentorship and training, specializing in an area such as streaming, performance tuning, or industry expertise. The role also involves providing technical leadership on big data projects, architecting data pipelines, assisting Solution Architects with technical sales, delivering tutorials and training, and contributing to the Databricks community.

Requirements

  • 5+ years of experience in a technical role, with expertise in at least one of the following:
    • Software Engineering/Data Engineering: data ingestion, streaming technologies such as Spark Streaming and Kafka, performance tuning, and troubleshooting and debugging Spark or other big data solutions
    • Data Applications Engineering: building use cases that put data to work, such as risk modeling, fraud detection, and customer lifetime value
  • Extensive experience building big data pipelines
  • Experience maintaining and extending production data systems to evolve with complex needs
  • Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent experience gained through work
  • Production programming experience in SQL and Python, Scala, or Java
  • 2 years of professional experience with big data technologies (e.g., Spark, Hadoop, Kafka) and architectures
  • 2 years of customer-facing experience in a pre-sales or post-sales role
  • Ability to meet expectations for technical training and role-specific outcomes within 6 months of hire
  • Ability to travel up to 30% when needed

Responsibilities

  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production-level data pipelines, including end-to-end pipeline load performance testing and optimization
  • Become a technical expert in an area such as data lake technology, big data streaming, or big data ingestion and workflows
  • Assist Solution Architects with the more advanced aspects of the technical sale, including custom proof-of-concept content, workload sizing estimates, and custom architectures
  • Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
  • Contribute to the Databricks Community

Preferred Qualifications

  • Deep specialty expertise in at least one of the following areas:
    • Scaling big data workloads (such as ETL) to be performant and cost-effective
    • Migrating Hadoop workloads to the public cloud (AWS, Azure, or GCP)
    • Building large-scale data ingestion pipelines and data migrations, including CDC and streaming ingestion pipelines
    • Cloud data lake technologies such as Delta Lake and Delta Live Tables

Benefits

  • Annual performance bonus
  • Equity
