Senior Python Developer

Encora

πŸ“Remote - Argentina, Bolivia

Summary

Join Encora as a Senior Python Developer and contribute to high-impact, scalable data solutions in cloud-native environments. You will design, build, and maintain robust data pipelines, implement automated testing, collaborate with business teams on visualizations, and integrate modern AI tools. This role requires strong Python, SQL, and data modeling skills, along with experience in AWS and data visualization tools. The position is remote and full-time, based in Colombia, Peru, Argentina, Costa Rica, or Bolivia. You will work within a cross-functional team, driving ingestion, transformation, modeling, and visualization efforts.

Requirements

  • 5+ years of hands-on experience with Python for data engineering and automation
  • Strong proficiency in SQL and data modeling techniques
  • Proven experience designing and implementing data pipelines in production environments
  • Solid understanding of data validation, testing frameworks, and error handling strategies
  • Deep familiarity with the AWS ecosystem, including services such as S3, ECS, CloudWatch, and Managed Workflows for Apache Airflow (MWAA)
  • Experience creating business-facing data visualizations using Plotly and Streamlit
  • Proven track record working with AI system integrations, including LangChain, RAG, vector stores, n8n, MCP servers, AWS SageMaker, and other tools for AI model orchestration and monitoring
  • Comfortable working independently and collaboratively in a remote, distributed team

Responsibilities

  • Design, build, and maintain robust data pipelines for ingestion, transformation, and modeling
  • Implement automated testing and error handling to ensure pipeline reliability and data quality
  • Collaborate directly with business teams to develop visualizations using Plotly and Streamlit
  • Operate within a cloud-based architecture (AWS), working with services such as S3, ECS, CloudWatch, and Managed Workflows for Apache Airflow (MWAA)
  • Deploy and manage infrastructure using tools like Docker, Terraform, and Azure DevOps
  • Integrate and monitor modern AI solutions using tools and frameworks such as RAG, vector stores, LangChain, n8n, MCP servers, and AWS SageMaker

Preferred Qualifications

Experience with Airflow, Snowflake, Terraform, Docker, MongoDB, Elastic, PostgreSQL, and Azure DevOps
