Technical Architect - Analytics

Simpplr

๐Ÿ“Remote - India

Summary

Join Simpplr as a Technical Architect - Analytics and play a critical role in designing, developing, and optimizing our data and analytics architecture. You will define the data strategy, design scalable data pipelines, and implement best practices for real-time and batch analytics solutions. This hands-on role requires a strong technical leader passionate about data engineering, analytics, and driving data-driven decision-making. You will collaborate with various teams, ensuring data security and compliance. The position offers a hybrid work model and is based in Gurgaon or Bangalore, India.

Requirements

  • 8+ years of experience in data architecture, analytics, and big data processing
  • Experience designing and implementing end-to-end data platforms for high-scale applications
  • Expertise in ETL/ELT pipelines, data modeling, data warehousing, and stream processing
  • Experience working with BI tools, data visualization, and reporting platforms
  • Big Data & Analytics: Spark, Kafka, Hadoop, Druid, ClickHouse, Presto, Snowflake, Redshift, BigQuery
  • Databases: PostgreSQL, MongoDB, Cassandra, Elasticsearch
  • Cloud Platforms: AWS, GCP, Azure (experience with cloud data warehouses like AWS Redshift, Snowflake is a plus)
  • Programming & Scripting: Python, SQL, Java, Scala
  • Understanding of real-time event processing architectures
  • Ability to design and implement long-term data strategies aligned with business goals
  • Strong analytical skills with a deep understanding of performance tuning for large-scale data systems
  • Ability to think strategically and drive engineering excellence within the team
  • Strong interpersonal and communication skills to collaborate effectively across teams
  • An eye for detail with the ability to translate ideas into tangible, impactful outcomes
  • Comfortable managing and delivering work in a fast-paced, dynamic environment

Responsibilities

  • Define and own the architecture for data processing, analytics, and reporting systems, ensuring scalability, reliability, and performance
  • Design and implement highly efficient, scalable, and reliable data pipelines for structured and unstructured data
  • Architect and optimize data processing workflows for batch, real-time, and streaming analytics
  • Work closely with Product Managers, Data Scientists, Analysts, and Software Engineers to translate business requirements into scalable data architectures
  • Stay ahead of industry trends, introduce modern data technologies, and drive best practices in data architecture, governance, and security
  • Review code, enforce data engineering best practices, and mentor engineers to build a high-performance analytics team
  • Ensure data security, integrity, and compliance with regulations (GDPR, CCPA, etc.)
  • Identify performance bottlenecks in data pipelines and analytics workloads, optimizing for cost, speed, and efficiency
  • Lead cloud-based data platform initiatives, ensuring high availability, fault tolerance, and cost optimization

Preferred Qualifications

  • Hands-on experience with AWS Public Cloud
  • Experience with Machine Learning Pipelines and AI-driven analytics
  • Hands-on experience with Kubernetes, Terraform, and Infrastructure-as-Code (IaC) for data platforms
  • Certifications in AWS Data Analytics, Google Professional Data Engineer, or equivalent
  • Experience with data security, encryption, and access control mechanisms
  • Experience in Event/Data Streaming platforms
  • Experience in risk management and compliance frameworks

Benefits

Hybrid work from home and office
