Data Engineer

IDT BY INDET GROUP

πŸ“Remote - Chile, Brazil

Summary

Join IDT, a leading telecommunications company, as a Data Engineer and work remotely from LATAM. You will be part of the BI team, focusing on data analysis, ETL/ELT design, and support. Responsibilities include designing and implementing data pipelines, maintaining Snowflake and Denodo solutions, recommending process improvements, and staying current on emerging technologies. You will partner with stakeholders, create documentation, and ensure data integrity and pipeline reliability. The ideal candidate possesses extensive ETL/ELT experience, strong SQL and PL/SQL skills, and proficiency in Python. Familiarity with cloud-based database services and Agile methodologies is also essential.

Requirements

  • 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics
  • Excellent English communication skills
  • Effective oral and written communication skills with the BI team and user community
  • Demonstrated experience using Python for data engineering tasks, including data transformation, advanced data manipulation, and large-scale data processing
  • Experience designing and implementing event-driven pipelines that leverage messaging and streaming events to trigger ETL workflows and enable scalable, decoupled data architectures (a minimal sketch follows this list)
  • Experience in data analysis and root cause analysis, with proven problem-solving and analytical-thinking capabilities
  • Experience designing complex data pipelines that extract data from RDBMS, JSON, API, and flat-file sources
  • Demonstrated expertise in SQL and PL/SQL programming and advanced mastery of Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, and Amazon Redshift
  • Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning
  • Understanding of software engineering principles, skill working on Unix/Linux/Windows operating systems, and experience with Agile methodologies
  • Proficiency in version control systems, with experience managing code repositories, branching, merging, and collaborating within a distributed development environment
  • Interest in business operations and a comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights
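
To illustrate the event-driven pattern referenced above, here is a minimal sketch in which a message queue triggers an ETL step. Python's stdlib queue.Queue stands in for a real broker (e.g., Kafka or SQS) so the example runs anywhere; the FileLanded event type, the run_etl step, and the paths are hypothetical, not IDT's actual stack.

```python
# Minimal sketch of an event-driven ETL trigger. queue.Queue stands in
# for a real message broker; all names below are illustrative.
import queue
import threading
from dataclasses import dataclass

@dataclass
class FileLanded:
    """Event emitted when a source system drops a new file."""
    path: str
    source: str

events: queue.Queue = queue.Queue()

def run_etl(event: FileLanded) -> None:
    # Placeholder transform: a real pipeline would extract the file,
    # apply transformations, and load the result into the warehouse.
    print(f"ETL triggered for {event.path} (source: {event.source})")

def consume() -> None:
    while True:
        event = events.get()
        if event is None:      # sentinel value: shut down cleanly
            break
        run_etl(event)         # producers never call the ETL directly,
        events.task_done()     # which keeps the architecture decoupled

worker = threading.Thread(target=consume)
worker.start()

# A producer (e.g., an ingestion service) only emits events.
events.put(FileLanded(path="s3://landing/orders/2024-06-01.json", source="orders"))
events.put(None)
worker.join()
```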

Responsibilities

  • Design, implement, and validate ETL/ELT data pipelines for batch processing, streaming integrations, and data warehousing, while maintaining comprehensive documentation and testing to ensure reliability and accuracy (a minimal validation sketch follows this list)
  • Maintain end-to-end Snowflake data warehouse deployments and develop Denodo data virtualization solutions
  • Recommend process improvements to increase efficiency and reliability in ELT/ETL development
  • Stay current on emerging data technologies and support pilot projects, ensuring the platform scales seamlessly with growing data volumes
  • Architect, implement, and maintain scalable data pipelines that ingest, transform, and deliver data into real-time data warehouse platforms, ensuring data integrity and pipeline reliability
  • Partner with data stakeholders to gather requirements for language-model initiatives and translate them into scalable solutions
  • Create and maintain comprehensive documentation for all data processes, workflows, and model deployment routines
  • Stay informed about, and open to learning, emerging data engineering methodologies and open-source technologies
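
As a concrete example of the testing mentioned above, here is a minimal sketch of a post-load validation step that reconciles row counts and an aggregate between source and target. It uses sqlite3 so it runs standalone; a real pipeline would point the same queries at the source system and Snowflake, and the table names are invented for illustration.

```python
# Minimal post-load validation sketch: reconcile source vs. target.
# sqlite3 keeps the example self-contained; table names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE dw_orders  (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, 7.25);
    INSERT INTO dw_orders SELECT * FROM src_orders;  -- the "load" step
""")

def scalar(sql: str) -> float:
    return conn.execute(sql).fetchone()[0]

# A mismatch fails the run before bad data reaches downstream consumers.
checks = {
    "row_count": ("SELECT COUNT(*) FROM src_orders",
                  "SELECT COUNT(*) FROM dw_orders"),
    "amount_sum": ("SELECT SUM(amount) FROM src_orders",
                   "SELECT SUM(amount) FROM dw_orders"),
}
for name, (src_sql, tgt_sql) in checks.items():
    src, tgt = scalar(src_sql), scalar(tgt_sql)
    assert src == tgt, f"{name} mismatch: source={src} target={tgt}"
    print(f"{name}: OK ({src})")
```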

Preferred Qualifications

  • Experience developing ETL/ELT processes within Snowflake and implementing complex data transformations using its built-in functions and SQL capabilities (a hedged sketch follows this list)
  • Experience with Pentaho Data Integration (Kettle) or Ab Initio ETL tools for designing, developing, and optimizing data integration workflows
  • Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, dbt, AWS Glue, AWS Lambda, and open-source tools
  • Experience with reporting/visualization tools (e.g., Looker) and job scheduler software
  • Experience in telecom, eCommerce, or international mobile top-up
  • Certifications such as AWS Solutions Architect, AWS Cloud Data Engineer, or Snowflake SnowPro Core
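
For the in-warehouse (ELT) transformation style named above, here is a hedged sketch of driving a Snowflake MERGE from Python with the snowflake-connector-python package. The connection parameters, table names, and MERGE logic are placeholders under assumed schemas, not real IDT resources.

```python
# Hedged sketch: push an ELT transformation down to Snowflake via
# snowflake-connector-python. All credentials and names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder credentials
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

# Dedupe staged rows into the target with built-in SQL (QUALIFY +
# window function) instead of moving the data out of the warehouse.
merge_sql = """
MERGE INTO CORE.ORDERS AS t
USING (
    SELECT id, amount, updated_at
    FROM STAGING.ORDERS_RAW
    QUALIFY ROW_NUMBER() OVER (PARTITION BY id ORDER BY updated_at DESC) = 1
) AS s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN INSERT (id, amount, updated_at)
                      VALUES (s.id, s.amount, s.updated_at);
"""
try:
    conn.cursor().execute(merge_sql)
finally:
    conn.close()
```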
