Solutions Architect, Data Engineering
Databricks
Job highlights
Summary
Join Databricks as a Specialist Solutions Architect (SSA) - Data Engineering and guide customers in building big data solutions on the Databricks platform. This customer-facing role requires hands-on experience with Apache Spark™ and expertise in other data technologies. You will help customers design and implement essential workloads and align their technical roadmap with the Databricks Data Intelligence Platform. As a go-to expert, you will strengthen your technical skills through mentorship and training and establish yourself in a specialty area. Day to day, you will provide technical leadership, architect data pipelines, assist Solution Architects, and contribute to the Databricks community. The role offers opportunities for professional growth and development within a dynamic and innovative company.
Requirements
- 5+ years of experience in a technical role, with expertise in at least one of the following:
  - Software Engineering/Data Engineering: data ingestion, streaming technologies such as Spark Streaming and Kafka, performance tuning, troubleshooting, and debugging Spark or other big data solutions (see the streaming sketch after this list)
  - Data Applications Engineering: building use cases that put data to work, such as risk modeling, fraud detection, and customer lifetime value
- Extensive experience building big data pipelines
- Experience maintaining and extending production data systems to evolve with complex needs
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent practical work experience
- Production programming experience in SQL and Python, Scala, or Java
- 2 years of professional experience with big data technologies (e.g., Spark, Hadoop, Kafka) and architectures
- 2 years of customer-facing experience in a pre-sales or post-sales role
- Ability to meet expectations for technical training and role-specific outcomes within 6 months of hire
- Ability to travel up to 30% when needed
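To make the streaming-technologies requirement above concrete, here is a minimal sketch of a Kafka-to-Delta ingestion job using Spark Structured Streaming. The broker address, topic name, schema, and paths are illustrative assumptions, not part of the role description, and the Kafka source assumes the spark-sql-kafka connector is available on the cluster (it is bundled on Databricks).

```python
# Minimal Kafka-to-Delta streaming ingestion sketch. Broker, topic,
# schema, and paths are placeholders; adapt them to the real workload.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

# Assumed payload schema for a hypothetical `events` topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers raw bytes; cast and parse the JSON payload into columns.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                               # placeholder path
)
```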
Responsibilities
- Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
- Architect production-level data pipelines, including end-to-end pipeline load performance testing and optimization (see the load-test sketch after this list)
- Become a technical expert in an area such as data lake technology, big data streaming, or big data ingestion and workflows
- Assist Solution Architects with the more advanced aspects of the technical sale, including custom proof-of-concept content, workload sizing estimates, and custom architectures
- Provide tutorials and training to improve community adoption (including hackathons and conference presentations)
- Contribute to the Databricks Community
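As a rough illustration of the load-testing responsibility above, the sketch below times synthetic Delta writes at increasing scale factors and reports throughput. The row counts and output path are assumptions; a real load test would also exercise reads, joins, and cluster sizing.

```python
# Rough end-to-end load-test sketch: time synthetic Delta writes at
# increasing scale and report write throughput. Sizes/paths are placeholders.
import time

from pyspark.sql import SparkSession
from pyspark.sql.functions import rand

spark = SparkSession.builder.appName("load-test").getOrCreate()

for scale in (1_000_000, 10_000_000):  # illustrative scale factors
    df = spark.range(scale).withColumn("value", rand())
    start = time.perf_counter()
    df.write.mode("overwrite").format("delta").save(f"/tmp/load_test_{scale}")
    elapsed = time.perf_counter() - start
    print(f"{scale:>12,} rows in {elapsed:.1f}s ({scale / elapsed:,.0f} rows/s)")
```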
Preferred Qualifications
- Deep specialty expertise in at least one of the following areas:
  - Scaling big data workloads (such as ETL) to be performant and cost-effective
  - Migrating Hadoop workloads to the public cloud (AWS, Azure, or GCP)
  - Large-scale data ingestion pipelines and data migrations, including CDC and streaming ingestion pipelines (see the MERGE sketch after this list)
  - Cloud data lake technologies such as Delta and Delta Live Tables
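For the CDC qualification above, one common pattern is applying a change feed to a Delta table with MERGE. The sketch below uses the delta-spark Python API; the table names, join key, and `op` flag column are assumptions made for illustration.

```python
# Hedged sketch: apply a CDC change feed to a Delta table with MERGE.
# Table names, the join key, and the `op` column are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc-apply").getOrCreate()

target = DeltaTable.forName(spark, "silver.customers")  # hypothetical target
changes = spark.read.table("bronze.customer_changes")   # hypothetical CDC feed

(
    target.alias("t")
    .merge(changes.alias("c"), "t.customer_id = c.customer_id")
    .whenMatchedDelete(condition="c.op = 'DELETE'")
    .whenMatchedUpdateAll(condition="c.op = 'UPDATE'")
    .whenNotMatchedInsertAll(condition="c.op = 'INSERT'")
    .execute()
)
```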
Benefits
- Annual performance bonus
- Equity