Backend Developer

Encora

πŸ“Remote - Mexico

Job highlights

Summary

Join Encora's dynamic team as a highly skilled Backend Developer. You will leverage your expertise in Python, SQL, and data pipeline construction to build scalable backend solutions and data infrastructure on AWS. This role requires 7+ years of experience in backend development with a focus on data engineering and cloud technologies. You will collaborate with data scientists and business stakeholders to create and optimize data visualizations. The position offers a work-from-home arrangement and the opportunity to work with cutting-edge technologies.

Requirements

  • 7+ years of experience in backend development, with a focus on data engineering, data pipelines, and cloud technologies
  • Proficient in Python, with extensive experience in writing clean, maintainable, and efficient code
  • Strong SQL skills, with experience in writing complex queries for large datasets and optimizing them for performance
  • Hands-on experience in building, maintaining, and optimizing data pipelines. Familiarity with ETL processes and tools is essential
  • Experience in ingesting, transforming, and modeling data for both transactional and analytical use cases
  • Solid experience in implementing unit tests, integration tests, and handling errors gracefully in production environments
  • Hands-on experience with AWS services such as S3, ECS, CloudWatch, Lambda, and MWAA (Managed Workflows for Apache Airflow). Proficiency in navigating the AWS console/CLI and troubleshooting via CloudWatch logs
  • Experience with Docker for containerizing applications and services
  • Ability to work with data scientists and business teams to create interactive dashboards and data visualizations using Plotly and Streamlit (a minimal sketch follows this list)
  • Familiarity with Git and continuous integration/deployment (CI/CD) pipelines
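
To give a concrete sense of the dashboarding work mentioned above, here is a minimal, illustrative sketch: a Streamlit app that renders a Plotly chart from a hypothetical CSV in S3. The bucket, file path, and column names are assumptions for illustration only, not actual project details.

    # Illustrative sketch only: bucket, path, and column names are hypothetical.
    import pandas as pd
    import plotly.express as px
    import streamlit as st

    st.title("Daily Orders")  # hypothetical dashboard title

    @st.cache_data
    def load_data() -> pd.DataFrame:
        # Assumes AWS credentials are configured and s3fs is installed.
        return pd.read_csv("s3://example-bucket/orders/daily.csv")  # hypothetical path

    df = load_data()
    region = st.selectbox("Region", sorted(df["region"].unique()))  # hypothetical column
    fig = px.line(
        df[df["region"] == region],
        x="date",
        y="order_count",
        title=f"Orders over time: {region}",
    )
    st.plotly_chart(fig, use_container_width=True)

Run locally with "streamlit run app.py"; in practice the data would more likely come from a warehouse query or an internal API than a flat file.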

Responsibilities

  • Design, develop, and maintain backend systems, focusing on performance, scalability, and reliability using Python and related technologies
  • Build, optimize, and maintain robust data pipelines for ingesting, processing, and transforming data in cloud environments. This includes working with both batch and real-time data flows
  • Design and implement efficient data models for large datasets. Ensure data is ingested in a way that supports downstream analytics and reporting, including the integration of AWS services (e.g., S3, Redshift, Glue)
  • Write and maintain unit tests and implement proper error handling to ensure the stability and reliability of backend systems
  • Work with AWS services and tools to manage and deploy backend applications and data pipelines. This includes using AWS S3, ECS, CloudWatch, Lambda, and MWAA (Managed Workflows for Apache Airflow) instances (see the sketch after this list)
  • Collaborate with data scientists, analysts, and business stakeholders to create and optimize visualizations with Plotly and Streamlit, helping the business interpret data effectively
  • Identify performance bottlenecks in backend systems and implement appropriate solutions to improve efficiency and scalability
  • Maintain comprehensive documentation on the design, deployment, and maintenance of backend systems and data pipelines
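
As a rough illustration of the pipeline work described above, the sketch below shows a minimal Airflow DAG of the kind MWAA schedules: extract, transform, and load steps chained as Python tasks. The DAG id, S3 prefix, and task bodies are placeholders, not an actual production pipeline.

    # Minimal, illustrative DAG; all identifiers are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Placeholder: pull raw objects from a hypothetical S3 prefix.
        print("extracting s3://example-bucket/raw/")

    def transform(**context):
        # Placeholder: clean and model the batch for downstream analytics.
        print("transforming batch")

    def load(**context):
        # Placeholder: write curated tables to the warehouse (e.g., Redshift).
        print("loading curated tables")

    with DAG(
        dag_id="example_etl",               # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

Deployed to MWAA, a file like this is simply uploaded to the environment's designated dags/ folder in S3 and scheduled by the managed Airflow instance.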

Preferred Qualifications

  • Experience working with Apache Airflow, particularly on AWS MWAA, to orchestrate workflows
  • Experience with Snowflake, including data modeling, querying, and performance tuning
  • Familiarity with Terraform to manage cloud infrastructure
  • Familiarity with databases such as MongoDB, Elasticsearch, PostgreSQL, or similar NoSQL/SQL solutions
  • Familiarity with Azure DevOps for version control, pipeline management, and continuous integration

Benefits

  • Work from home
