Remote Solution Architect

Unison Consulting

πŸ“Remote - India

Job highlights

Summary

Join our team as a Solution Architect and design technical solutions to meet business requirements. Work closely with stakeholders to translate needs into scalable technology solutions.

Requirements

  • Intermediate to Advanced working knowledge of Spark
  • Hands-on experience with advanced SQL, and at least one of Python / Scala / Java
  • Expertise in at least one of the three data personas: Data Engineering, Data Analytics and BI, or ML Engineering
  • Hands-on experience with at least one of the Public Clouds (AWS / Azure / GCP) and its native Data "Managed Services"
  • Familiarity with the core concepts of cloud architecture
  • Experience in implementation/migration projects on Databricks and Databricks Certification

Responsibilities

  • Designing and implementing technical solutions to meet the business requirements of the organization
  • Working closely with business stakeholders, project managers, developers, and other IT teams
  • Translating business needs into scalable, reliable, and high-performance technology solutions
  • Ensuring that the technical solutions align with the overall enterprise architecture and adhere to best practices and industry standards

Preferred Qualifications

  • Experience with Data Orchestration tools or Data Ingestion tools
  • Performance tuning skills on Spark, query tuning skills on distributed query engines
  • Cloud best practices around Data movement, governance frameworks
  • DevOps on Cloud; experience with Infrastructure-as-Code solutions like Terraform or CloudFormation

Job description

Job Summary: The Solution Architect is responsible for designing and implementing technical solutions to meet the business requirements of the organization. They work closely with business stakeholders, project managers, developers, and other IT teams to translate business needs into scalable, reliable, and high-performance technology solutions. The Solution Architect is responsible for ensuring that the technical solutions align with the overall enterprise architecture and adhere to best practices and industry standards.

Experience in implementation/migration projects on Databricks and Databricks Certification are big positives.

Must-have

  1. Intermediate to Advanced working knowledge of Spark (traditional Hadoop-based experience alone will not be sufficient)
  2. Hands-on experience with advanced SQL, and at least one of Python / Scala / Java
  3. Expertise in at least one of the three data personas: Data Engineering, Data Analytics and BI, or ML Engineering
  4. Hands-on experience with at least one of the Public Clouds (AWS / Azure / GCP) and its native Data "Managed Services" (examples: EMR, Athena, SageMaker, Synapse, Azure ML, Dataproc, BigQuery, etc.)
  5. Familiarity with the core concepts of cloud architecture (fundamentals of cloud networking, cloud object stores, security & IAM, etc.)
  6. Experience in implementation/migration projects on Databricks and Databricks Certification are big positives

Good-to-have

  1. Experience with Data Orchestration tools or Data Ingestion tools
  2. Performance tuning skills on Spark, query tuning skills on distributed query engines
  3. Cloud best practices around Data movement, governance frameworks
  4. DevOps on Cloud; experience with Infrastructure-as-Code solutions like Terraform or CloudFormation.
