Senior Software Engineer

Cadmus

πŸ“Remote - India

Summary

Join Cadmus, a global EdTech company, and contribute to our mission of providing high-quality education to 1 billion students by 2050. As a Senior Software Engineer, you will play a key role in developing and maintaining scalable data pipelines, integrating data from various sources, and ensuring our data architecture supports business intelligence and analytics. You will collaborate with cross-functional teams to build and optimize our data infrastructure, focusing on delivering a high-quality user experience. This role requires extensive experience in data engineering, Amazon Redshift, Python, and AWS, along with strong product engineering skills. The position offers a remote-friendly, flexible work culture and opportunities for professional development. We are a small, fast-growing team, and you will need to be highly autonomous and proactive.

Requirements

  • 6+ years of overall experience, with 3+ years in data engineering
  • Expertise in Amazon Redshift, Python, and AWS
  • Hands-on experience with dbt (Data Build Tool) for managing SQL transformations and data models
  • Extensive experience with AWS services such as S3, Lambda, EC2, RDS, and CloudWatch
  • Expertise in data modeling concepts and designing efficient data structures (e.g., star schemas, snowflake schemas) in a data warehouse environment
  • Experience building ETL/ELT pipelines and integrating data from multiple sources, including structured and unstructured data (see the sketch after this list)
  • 2+ years of working with large, complex datasets
  • Advanced knowledge of SQL for querying and optimizing large datasets in Redshift
  • Experience building large-scale data-scraping processes for business intelligence purposes
  • Experience working with data analysts to clean up data anomalies and build dashboard reports
  • Flexibility and comfort with ambiguity
  • Experience scaling systems to support rapid growth
  • Ability to work autonomously and proactively
  • Ability to maintain at least 6 hours of overlap with the AEST timezone, between 9 AM and 6:30 PM AEST
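
For illustration only, here is a minimal sketch of the kind of ELT step this role involves: loading files from S3 into Amazon Redshift with a COPY command issued from Python. The bucket, table, and IAM role names are hypothetical placeholders, not part of Cadmus's actual stack.

    # Minimal ELT sketch (illustrative only): copy CSV files from S3 into Redshift.
    # Bucket, table, and IAM role names are hypothetical placeholders.
    import os
    import psycopg2

    COPY_SQL = """
        COPY analytics.page_events
        FROM 's3://example-data-lake/page_events/2024-01-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
        FORMAT AS CSV
        IGNOREHEADER 1
        TIMEFORMAT 'auto';
    """

    def load_page_events() -> None:
        # Credentials come from the environment; never hard-code them.
        conn = psycopg2.connect(
            host=os.environ["REDSHIFT_HOST"],
            port=5439,
            dbname=os.environ["REDSHIFT_DB"],
            user=os.environ["REDSHIFT_USER"],
            password=os.environ["REDSHIFT_PASSWORD"],
        )
        try:
            with conn, conn.cursor() as cur:
                cur.execute(COPY_SQL)  # Redshift ingests the files directly from S3
        finally:
            conn.close()

    if __name__ == "__main__":
        load_page_events()

In practice a step like this would typically be orchestrated (for example via Airflow/MWAA) with the downstream transformations modelled in dbt.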

Responsibilities

  • Develop and maintain scalable data pipelines
  • Integrate data from multiple sources
  • Ensure data architecture supports business intelligence, reporting, and analytics requirements
  • Collaborate with cross-functional teams to build and optimize data infrastructure
  • Provide clean, high-quality data to the business
  • Help build great products for users
  • Ensure product features are reliable, performant, and scalable
  • Solve exciting engineering challenges
  • Deliver improved functionality

Preferred Qualifications

  • Knowledge of Jinja templating in Python (see the sketch after this list)
  • Solid experience with Airflow and Amazon MWAA (Managed Workflows for Apache Airflow)
  • Experience with DevOps practices for managing infrastructure and CI/CD pipelines (Docker, Kubernetes)
  • 2+ years of professional experience in backend or full-stack software development
  • Ability to develop rich front-end applications in React
  • Experience writing backend services in Elixir/Go/Python/Ruby, with GraphQL and REST APIs
  • Experience building products from scratch (zero to one)
  • Strong interest in AI/ML, with an understanding of machine learning pipelines and how data engineering supports AI/ML initiatives
  • Ability to design and run experiments, and fine-tune models
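
As an illustration of the Jinja point above, here is a minimal sketch (not Cadmus code; table and column names are made up) of dbt-style SQL templating rendered directly with the jinja2 library.

    # Illustrative sketch of Jinja templating in Python, in the style dbt uses for SQL models.
    # Schema, table, and column names are hypothetical placeholders.
    from jinja2 import Template

    SQL_TEMPLATE = Template("""
    SELECT
        {{ grain }},
        COUNT(DISTINCT student_id) AS active_students
    FROM {{ schema }}.submission_events
    WHERE event_date >= '{{ start_date }}'
    GROUP BY {{ grain }}
    ORDER BY {{ grain }}
    """)

    if __name__ == "__main__":
        rendered = SQL_TEMPLATE.render(
            grain="event_date",
            schema="analytics",
            start_date="2024-01-01",
        )
        print(rendered)  # the rendered SQL could then be executed against Redshift

dbt wraps this same templating mechanism (plus refs, macros, and materialisations) around SQL models, which is why the Jinja and dbt requirements tend to go hand in hand.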

Benefits

  • Remote-friendly, flexible working culture
  • Diverse and inclusive workplace
  • Mentoring and succession planning for your career
