Specialist Solutions Architect

Databricks
Summary
Join Databricks as a Specialist Solutions Architect (SSA) - Data Intelligence Platform and guide partners in building big data solutions on Databricks. This partner-facing role requires hands-on experience with SQL, Apache Spark™, and other data technologies. You will provide technical leadership, architect data pipelines, and become a technical expert in an area such as data lake technology or big data streaming. Responsibilities include assisting Solution Architects, providing training, and contributing to the Databricks community. The ideal candidate has 5+ years of technical experience, expertise in data engineering or data science, and deep specialty expertise in an area such as data governance. A Bachelor's degree in a relevant field or equivalent experience is required. This remote-friendly role offers a competitive salary and benefits package.
Requirements
- 5+ years of experience in a technical role, with expertise in at least one of the following: Software Engineering/Data Engineering: data ingestion; streaming technologies such as Spark Streaming and Kafka; and performance tuning, troubleshooting, and debugging of Spark or other big data solutions
- Data Applications Engineering: building use cases that put data to work, such as risk modeling, fraud detection, and partner lifetime value
- Data Science or Machine Learning Ops: designing and building production infrastructure, model management, and deployment of advanced analytics that drive measurable business value (i.e., getting models running in production)
- Must be able to work both collaboratively and independently to achieve outcomes that support go-to-market priorities, with the interpersonal savvy to influence both partners and internal stakeholders without direct authority
- Deep specialty expertise in at least one of the following areas: data governance systems and solutions, which may span technologies such as Unity Catalog, Alation, Collibra, and Purview
- Experience with high-performance, production data processing systems (batch and streaming) on distributed infrastructure
- Experience building large-scale real-time stream processing systems; expertise in high-volume, high-velocity data ingestion, change data capture, data replication, and data integration technologies
- Experience migrating and modernizing Hadoop jobs to public cloud data lake platforms, including data lake modeling and cost optimization
- Expertise in cloud data formats such as Delta Lake and declarative ETL frameworks such as Delta Live Tables (DLT); a brief illustrative sketch follows this list
- Expertise in building GenAI solutions such as RAG, fine-tuning, or pre-training for custom model creation
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent practical experience
- Production programming experience in SQL and in at least one of Python, Scala, or Java
- Experience with AWS, Azure, or GCP
- 3+ years of professional experience with Big Data technologies (e.g., Spark, Hadoop, Kafka) and architectures
- 3+ years of customer- or partner-facing experience in a pre-sales or post-sales role, such as a consultant working for a system integration partner
- Can meet expectations for technical training and role-specific outcomes within 6 months of hire
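The streaming, Kafka, and Delta Lake requirements above are the kind of skills a Kafka-to-Delta ingestion job exercises. As a rough, non-authoritative illustration (it is not part of the listing itself), the sketch below shows a minimal Spark Structured Streaming pipeline in Python that parses JSON events from a Kafka topic and appends them to a Delta table. The broker address, topic name, event schema, checkpoint path, and target table are all hypothetical placeholders, and the code assumes a Spark environment with the Kafka and Delta Lake connectors available (as on Databricks).

```python
# Minimal sketch: stream events from Kafka into a Delta table.
# All names below (broker, topic, schema, paths, table) are
# hypothetical placeholders, not taken from the job listing.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("kafka-to-delta-sketch").getOrCreate()

# Assumed schema for the incoming JSON payloads.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a continuous stream of records from a placeholder Kafka topic.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                     # placeholder topic
       .load())

# Kafka delivers the payload as bytes; cast it to a string and
# parse the JSON into typed columns.
parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), event_schema).alias("e"))
          .select("e.*"))

# Append parsed events to a Delta table; the checkpoint lets the
# query recover its progress exactly-once after a restart.
query = (parsed.writeStream
         .format("delta")
         .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder
         .outputMode("append")
         .toTable("bronze.events"))  # placeholder table name

query.awaitTermination()
```

A production version of such a pipeline would typically add schema enforcement, dead-letter handling, and the end-to-end load and performance testing called out under Responsibilities.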
Responsibilities
- Provide technical leadership to guide strategic partners to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
- Architect production-level data pipelines, including end-to-end pipeline load performance testing and optimization
- Become a technical expert in an area such as data lake technology, big data streaming, or big data ingestion and workflows
- Assist Solution Architects with aspects of the technical sale as they work alongside partners, including customizing proof-of-concept content, estimating workload sizing, and designing custom architectures
- Provide tutorials and training to improve partner community adoption (including hackathons and conference presentations)
- Contribute to the Databricks Community
- Maintain and extend production data systems to evolve with complex needs
Preferred Qualifications
This role can be remote, but we prefer that you be located in the job listing area and able to travel up to 30% when needed
Benefits
- Annual performance bonus
- Equity