Databricks Architect

Tiger Analytics
Summary
Join Tiger Analytics, a rapidly expanding advanced analytics consulting firm, as a Databricks Architect. You will be responsible for designing and implementing data best practices, leveraging your expertise in Databricks Lakehouse, AWS, Snowflake, and Apache Iceberg. The role demands strong MLOps and CI/CD expertise and proficiency in Python/Scala (Spark). You will define the architecture and strategy for Databricks on AWS/Snowflake integration, design a cost-managed, multi-region Databricks MLOps platform, and implement Unity Catalog for secure access control. The position offers significant career development opportunities in a challenging, entrepreneurial environment. Remote work is possible.
Requirements
- Strong experience in data architecture, specifically Databricks architecture
- Expertise in Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow), AWS, Snowflake, and Apache Iceberg
- Strong MLOps and CI/CD expertise
- Proficiency in Python/Scala (Spark) for data governance, security, and enterprise data platform design
Responsibilities
- Define the architecture and strategy for Databricks on AWS/Snowflake integration to deliver secure, scalable data/AI platforms
- Architect seamless data flow between Snowflake and Databricks via Apache Iceberg, including ML outputs
- Design a cost-managed, multi-region Databricks MLOps platform and its data flows
- Implement Unity Catalog for fine-grained access control and secure coexistence of teams across Workspaces
- Develop Databricks-native MLOps (environment parity, Git-driven CI/CD, governed access, MLflow governance, standardized deployment, monitoring)
- Define enterprise-scalable AI governance for GenAI production deployment
Preferred Qualifications
Databricks certifications and GenAI architecture experience