Senior Specialist Solutions Architect

Databricks

💵 $157k-$219k
📍 Remote - United States

Summary

Join Databricks as a Specialist Solutions Architect (SSA) - Data Intelligence Platform to guide partners in leveraging our platform and integrating with AWS services for big data solutions. This partner-facing role supports field Solution Architects and partner teams and requires hands-on experience with AWS, SQL, Apache Spark™, and other data technologies. You will help partners design and implement essential workloads, aligning their technical roadmaps with the Databricks Data Intelligence Platform. As a go-to expert, you'll strengthen technical skills through mentorship and training, establishing expertise in areas like data governance, data science, or machine learning. You will drive product adoption, provide training, translate field trends into cohesive strategies, and provide technical leadership for successful implementations. This role requires collaborating with internal and external stakeholders and contributing to the Databricks community.

Requirements

  • 5+ years of experience in a technical role, with expertise in at least one of the following areas on AWS:
  • Software Engineering/Data Engineering: data ingestion, streaming technologies such as Spark Streaming and Kafka, performance tuning, troubleshooting, and debugging of Spark or other big data solutions
  • Data Applications Engineering: building use cases that put data to work, such as risk modeling, fraud detection, and partner lifetime value
  • Data Science or Machine Learning Ops: designing and building production infrastructure, model management, and deployment of advanced analytics that drive measurable business value (i.e., getting models running in production)
  • Must be able to work both collaboratively and independently to achieve outcomes that support go-to-market priorities, and have the interpersonal savvy to influence both partners and internal stakeholders without direct authority
  • Deep specialty expertise in at least one of the following areas:
  • Expertise in data governance systems and solutions that may span technologies such as Unity Catalog, Alation, Collibra, Purview, etc.
  • Experience with high-performance, production data processing systems (batch and streaming) on distributed infrastructure
  • Experience building large-scale real-time stream processing systems; expertise in high-volume, high-velocity data ingestion, change data capture, data replication, and data integration technologies
  • Experience migrating and modernizing Hadoop jobs to public cloud data lake platforms, including data lake modeling and cost optimization
  • Expertise in cloud data formats such as Delta Lake and declarative ETL frameworks such as DLT (Delta Live Tables)
  • Expertise in building GenAI solutions such as retrieval-augmented generation (RAG), fine-tuning, or pre-training for custom model creation
  • Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent experience gained through work
  • Experience maintaining and extending production data systems so they evolve with complex needs
  • Production programming experience in SQL and Python, Scala, or Java
  • Experience with the AWS cloud
  • 3+ years of professional experience with big data technologies (e.g., Spark, Hadoop, Kafka) and architectures
  • 3+ years of system integration partner or customer-facing experience in a pre-sales or post-sales role (e.g., as a consultant working for a partner)
  • Can meet expectations for technical training and role-specific outcomes within 6 months of hire

Responsibilities

  • Drive adoption and grow knowledge of Databricks products and accelerators on AWS by energizing the ecosystem of system integration partners, AWS technical field consultants, and the Databricks direct field
  • Provide tutorials and training to improve partner community adoption (including workshops, hackathons, and conference presentations)
  • Translate field trends, AWS priorities, and Databricks product strategy into a cohesive story, with clear callouts for where both sides can be leveraged to build customer value
  • Provide technical leadership to guide strategic partners to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Demonstrate thought leadership by translating customer adoption patterns from the field and collaborating with product teams on potential AWS integrations
  • Become a technical expert in an area such as the open Lakehouse, big data streaming, or data ingestion and workflows
  • Assist Solution Architects with aspects of the technical sale as they work alongside partners, including customizing proof-of-concept content and architectures
  • Contribute to the Databricks Community

Preferred Qualifications

This role can be remote, but we prefer that you be located in the job listing area (Seattle) and be able to travel up to 30% when needed

Benefits

  • $157,000 - $219,775 USD
  • Annual performance bonus
  • Equity
