Senior Data Engineer - Apache Flink

Encora
Remote - Brazil
Summary
Join Encora as a Senior Data Engineer and work remotely from Brazil, focusing on stream processing technologies such as Apache Flink and Kafka Connect. You will design, implement, and maintain scalable data pipelines that process both real-time and batch data. The role involves building data infrastructure on GCP, optimizing BigQuery, collaborating with product teams, and ensuring data quality. Experience with Apache Flink, Kafka, GCP, SQL, and data modeling is essential. The position is full-time and fully work-from-home.
Requirements
- Have experience as a Data Engineer
- Have strong experience with Apache Flink, including knowledge of the DataStream and Table APIs (a DataStream sketch follows this list)
- Have experience with Kafka and Kafka Connect for data integration
- Have experience with containerization (Docker) and orchestration (Kubernetes)
- Have experience with cloud infrastructure platforms such as AWS or GCP
- Have solid SQL skills with specific experience in BigQuery SQL dialect
- Have experience with data modeling and schema design
- Be proficient in at least one programming language (Java or Python)
- Have experience with version control systems (Git) and CI/CD pipelines
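
To give a concrete flavor of the Flink and Kafka experience listed above, here is a minimal sketch of a DataStream job in Java that consumes a Kafka topic. The broker address, topic, consumer group, and class name are placeholder assumptions for illustration, not details from this posting.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaEventsJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Kafka source; bootstrap servers, topic, and group id are hypothetical.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("events")
                .setGroupId("flink-events-consumer")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events");

        // Placeholder transformation; a real pipeline would parse, enrich,
        // window, and write to a proper sink instead of printing.
        events.map(String::toUpperCase).print();

        env.execute("kafka-events-sketch");
    }
}
```

A production job would replace the map/print stage with real parsing, enrichment, windowing, and a sink, and the same logic could equally be expressed with the Table API mentioned above.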
Responsibilities
- Design, develop, and optimize data pipelines using Apache Flink for stream and batch processing
- Implement and maintain Kafka Connect connectors for seamless data integration
- Build and maintain data infrastructure on Google Cloud Platform (GCP)
- Design and optimize BigQuery tables, views, and stored procedures (a query sketch follows this list)
- Collaborate with product managers and analysts to understand data requirements
- Ensure data quality, reliability, and proper governance
- Troubleshoot and resolve data pipeline issues
- Document data flows, transformations, and architecture
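
As a small illustration of the BigQuery side of these responsibilities, the sketch below runs a standard-SQL query with the google-cloud-bigquery Java client. The dataset, table, and column names are assumed for the example and are not taken from the posting.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class BigQueryDailyTotals {
    public static void main(String[] args) throws InterruptedException {
        // Uses Application Default Credentials for the active GCP project.
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // Hypothetical dataset, table, and columns; counts events per day.
        String sql =
                "SELECT DATE(event_ts) AS day, COUNT(*) AS events "
              + "FROM `analytics.events` "
              + "GROUP BY day ORDER BY day DESC LIMIT 7";

        QueryJobConfiguration config = QueryJobConfiguration.newBuilder(sql).build();
        TableResult result = bigquery.query(config);

        for (FieldValueList row : result.iterateAll()) {
            System.out.printf("%s: %d events%n",
                    row.get("day").getStringValue(),
                    row.get("events").getLongValue());
        }
    }
}
```

The same client can also create or update tables and views, which is closer to the schema-design work described in this list.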
Preferred Qualifications
- Have knowledge of and hands-on experience with Kafka infrastructure (i.e., managing and configuring Kafka components)
- Have experience with other stream processing frameworks (Spark Streaming, Kafka Streams)
- Have knowledge of data governance and security best practices
- Have proficiency with Google Cloud Platform services, particularly Google Cloud Storage, Dataflow, Pub/Sub, and BigQuery
- Have experience with Apache Airflow
- Have experience with BI tools such as Tableau or Power BI
- Have familiarity with monitoring tools such as Prometheus and Grafana
- Have certifications such as Google Cloud Professional Data Engineer
Benefits
- Work from home