Senior Data Engineer

Encora
Remote - Mexico
Summary
Join Encora's Data Engineering team as an experienced Data Engineer with expertise in stream processing technologies like Apache Flink and Kafka Connect. You will design, implement, and maintain scalable data pipelines processing real-time and batch data on GCP. Responsibilities include optimizing data pipelines, building data infrastructure, collaborating with stakeholders, and ensuring data quality. A Bachelor's degree in a related field and 3+ years of experience are required. Strong Apache Flink, Kafka, and SQL skills are essential. The role is full-time and remote.
Requirements
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
- 3+ years of experience as a Data Engineer
- Strong experience with Apache Flink, including knowledge of the DataStream and Table APIs
- Experience with Kafka and Kafka Connect for data integration
- Proficiency in containerization (Docker) and orchestration (Kubernetes)
- Experience with cloud platforms such as AWS or GCP
- Solid SQL skills, particularly with BigQuery SQL dialect
- Experience with data modeling and schema design
- Proficient in at least one programming language (Java or Python)
- Familiarity with version control systems (Git) and CI/CD pipelines
Responsibilities
- Design, develop, and optimize data pipelines using Apache Flink for stream and batch processing
- Implement and maintain Kafka Connect connectors for seamless data integration
- Build and maintain data infrastructure on Google Cloud Platform (GCP)
- Design and optimize BigQuery tables, views, and stored procedures
- Collaborate with product managers and analysts to understand data requirements
- Ensure data quality, reliability, and proper governance
- Troubleshoot and resolve data pipeline issues
- Document data flows, transformations, and architecture
Preferred Qualifications
- Master's degree in Computer Science or a related field
- Hands-on experience managing and configuring Kafka Infrastructure
- Experience with other stream processing frameworks (Spark Streaming, Kafka Streams)
- Knowledge of data governance and security best practices
- Proficiency with Google Cloud Platform (GCP) services, including: Google Cloud Storage, Dataflow, Pub/Sub, BigQuery
- Experience with Apache Airflow
- Familiarity with BI tools such as Tableau or Power BI
- Understanding of monitoring tools like Prometheus and Grafana
- Google Cloud Professional Data Engineer certification
Benefits
Remote work