Summary
Join Encora as a Data Engineer in Brazil and work remotely, playing a crucial role in data modeling, transformation, and pipeline development. Collaborate with customers, including BI and software engineering teams, to integrate and maintain high-quality data. Drive the adoption of cloud data platform capabilities using tools like Google Cloud Platform (BigQuery), Fivetran, and dbt. Occasionally develop dashboards in Looker. This role involves creating and maintaining data applications and machine learning models, deploying modern data solutions, and ensuring data accessibility.
Requirements
- Python programming for data manipulation and analysis
- ETL concepts and tools
- Software Development background
- Familiarity with data exploration tools
- Hands-on experience in data engineering, preferably with modern cloud technologies (e.g., Snowflake, dbt, Fivetran, Databricks)
Responsibilities
- Responsible for creating and maintaining data applications and machine learning models
- Deploy modern data solutions that interact with Generative AI and build APIs for delivering models and providing data access
- Ensure that data is received, verified, transformed, stored, and made accessible to the team
- Build and promote data pipeline management and database maintenance
- Gather and prepare data to be analyzed and worked on, correcting possible processing failures
- Consume and transform data obtained from different sources (CSV, APIs, etc.)
- Add value by building, structuring and optimizing data to be used by analysts and users
Preferred Qualifications
- Experience with extracting, processing and modeling data
- Experience in creating scripts for querying and manipulating data in SQL databases
- Experience in developing ETL routines (Data Extraction, Transformation and Loading)
- Experience in AWS Tools (AWS Glue, AWS Lambda, AWS Athena, AWS S3, AWS EC2)
- Experience with streaming data and workflow orchestration (e.g., Kafka, Spark, Airflow)
- Experience with creating and maintaining CI/CD pipelines (CircleCI, GitHub Actions)
Benefits
Work from home