Senior Data Engineer

DaCodes

πŸ“Remote - Mexico

Summary

Join DaCodes, a leading software development and digital transformation company, as a Senior Data Engineer! You will design, build, and optimize data pipelines for large-scale applications. This role requires expertise in big data, real-time processing, and data lakes on cloud platforms such as AWS, GCP, or Azure. You will collaborate with data scientists, analysts, and engineers to deliver high-quality data solutions. We offer a fast-paced, agile environment with opportunities for professional growth and collaboration with global brands and startups, along with remote work flexibility and a comprehensive benefits package.

Requirements

  • 5+ years of experience in data engineering, data architecture, or backend development
  • Strong expertise in SQL and NoSQL databases (PostgreSQL, MySQL, MongoDB, DynamoDB, etc.)
  • Cloud expertise with AWS (preferred), GCP, or Azure
  • Proficiency in Python, Java, or Scala for data processing and pipeline development
  • Experience with big data frameworks like Apache Spark, Hadoop, or Flink
  • Hands-on experience with ETL/ELT processes and data pipeline orchestration tools (Apache Airflow, dbt, Luigi, or Prefect)
  • Experience with message queues and streaming technologies (Kafka, Kinesis, Pub/Sub, or RabbitMQ)
  • Knowledge of containerization and orchestration tools (Docker, Kubernetes)
  • Strong problem-solving skills and the ability to optimize performance and scalability
  • English proficiency (B2 or higher) to collaborate with international teams

Responsibilities

  • Design, develop, and maintain scalable and efficient data pipelines for batch and real-time processing
  • Build and optimize data lakes, warehouses, and analytics solutions on cloud platforms (AWS, GCP, or Azure)
  • Implement ETL/ELT workflows using tools such as Apache Airflow, dbt, or Prefect
  • Ensure data integrity, consistency, and governance through proper architecture and best practices
  • Integrate data from various sources (structured and unstructured), including APIs, streaming services, and databases
  • Work with data scientists and analysts to ensure high availability and accessibility of data for analytics and machine learning models
  • Monitor, troubleshoot, and improve the performance of data pipelines
  • Implement security best practices for data access, encryption, and compliance
  • Collaborate with software engineers to integrate data pipelines into applications and services
  • Stay up to date with the latest trends in big data, cloud technologies, and data engineering best practices

Preferred Qualifications

  • Experience with data lakehouse architectures (Delta Lake, Iceberg, Hudi)
  • Familiarity with Machine Learning (ML) and AI-related data workflows
  • Experience with Infrastructure as Code (Terraform, CloudFormation) for managing data environments
  • Knowledge of data security and compliance regulations (GDPR, CCPA, HIPAA)

Benefits

  • Remote work/Home office
  • Work schedule aligned with your assigned team/project (client's time zone)
  • Monday to Friday work week
  • Statutory benefits as required by law
  • Official holidays according to your assigned team/project
  • Vacation days
  • Day off on your birthday
  • Major medical insurance
  • Life insurance
  • Virtual integration events and interest groups
  • Meetups with special guests: IT professionals and speakers from companies and prestigious universities
  • Constant feedback and performance tracking
  • Access to courses and certifications
  • Multicultural work teams
  • English classes
  • Opportunities across our different business lines
