Senior Data Architect

Blue Orange Digital
Summary
Join Blue Orange Digital as a Databricks Data Architect to lead Lakehouse design, governance, and client engagement. You will collaborate with delivery squads, model data, tune clusters, and enforce standards. Responsibilities include leading discovery workshops, shaping solutions, supporting pre-sales, and mentoring client technical leads. You will design logical/physical models, architect multi-cloud solutions, and define governance and security measures. Performance and cost optimization, implementation leadership, and cross-functional collaboration are also key aspects of this role. The ideal candidate possesses 5-7 years of cloud data platform experience, including 3+ years with Databricks, and a deep understanding of Delta Lake, Unity Catalog, and Spark performance tuning.
Requirements
- 5–7 years building cloud data platforms; 3+ years hands-on with Databricks
- Deep expertise in Delta Lake ACID, Unity Catalog and Spark performance tuning
- Proven experience architecting Lakehouse or Cloud DW solutions on two or more major clouds
- Strong SQL + PySpark/Scala; working knowledge of dbt, Airflow or similar orchestrators
- Databricks Data Engineer Professional certification (or ability to earn in 90 days)
- Excellent communication skills for client workshops, documentation, and mentoring
- Ability to engage and communicate effectively with clients at all levels, developing technical solutions that solve their challenges and advance their interests
- Bachelor’s degree or higher in Computer Science, Engineering, Data Science, or related field, or equivalent experience
- Ability to translate complex technical concepts into understandable terms; adept at engaging and influencing senior management and non-technical stakeholders
- Self-directed and results-driven, capable of delivering outcomes independently with limited external direction
- Eager to learn and adapt in a rapidly evolving tech landscape
- Ability and willingness to travel as required to meet clients and attend industry events
Responsibilities
- Act as the primary technical liaison during key engagements—translating business goals into architecture that both sides understand
- Lead discovery workshops and roadmap sessions to surface requirements, constraints and success metrics, then map them to scalable Databricks patterns
- Partner with account & sales teams to shape estimates, reference architectures, and bill-of-materials for proposals and SOWs
- Provide architecture-level answers for RFPs/RFIs and join pitch calls when deep Databricks credibility is essential
- Mentor client technical leads during early project phases to ensure knowledge transfer and long-term success
- Design logical/physical models, storage layers and streaming/CDC patterns with Delta Lake and Unity Catalog
- Architect multi-cloud Databricks solutions (AWS, Azure, GCP) covering ETL/ELT, structured streaming and governance zones
- Define catalog/permission models, retention policies and lineage artifacts to meet HIPAA, SOC 2, GDPR and similar frameworks
- Implement row-/column-level security, tokenization and end-to-end audit logging
- Tune cluster sizing, Photon/SQL Warehouse configs, Z-Ordering and auto-compaction to hit SLA and cost targets
- Instrument dashboards for query latency, job runtimes and spend
- Lead design reviews, pair with engineers on PySpark/Scala, and sign off on pull requests before production
- Publish best-practice templates, Terraform workspace bootstraps and CI/CD guidelines
- Work closely with Platform Ops, Security, Analytics and Product teams to translate requirements into production-ready data solutions
- Host lunch-and-learns and brown-bag demos to level up Databricks skill sets across Blue Orange
Preferred Qualifications
- Experience as a Databricks Champion within your organization
- Experience migrating legacy Hadoop/Snowflake/Redshift to Lakehouse
- Familiarity with MLflow, Feature Store and Databricks Model Serving
- DataOps/CI-CD for notebooks and IaC (Terraform, Azure DevOps, GitHub Actions)
- Domain depth in one of our focus verticals (FinTech, Sports Analytics, Manufacturing, etc.)
- Experience with transactional data systems and stacks such as Java, Spring Boot, Kafka, SQL Server, Postgres, and MongoDB, as well as microservices, message queues, actor models, and event-driven architectures
- Experience consulting in any of the following vertical industries: Financial Services, Healthcare, Retail/CPG, Manufacturing, Travel & Hospitality
- Experience working with ERP systems such as SAP, Oracle NetSuite, Microsoft Dynamics, JD Edwards, Oracle, Sage, and Workday
- Engineering certifications in Databricks (beyond the Professional certification), Azure, AWS, GCP, Snowflake, and related tools
- Experience serving as a consultative liaison between clients and technical teams, engaging with senior-level stakeholders to understand their business challenges and articulating clear, compelling technical solutions aligned with their strategic goals
- Self-starter with a proven ability to lead complex client engagements, often amid ambiguity and with little direction
- Master’s, MBA, or other advanced degree is a plus
Benefits
- 401k Matching
- Unlimited PTO
- 100% remote role with an option for hybrid
- Healthcare, Dental, Vision, and Life Insurance
- Paid parental/bereavement leave
- Home office stipend