Senior Data Engineer

Yassir
Salary: $50k-$120k
Location: Remote - Egypt, South Africa
Summary
Join Yassir, a leading super app in the Maghreb region expanding into France, Canada, and Sub-Saharan Africa, and help build a centralized data lake on GCP. We offer on-demand services and are introducing financial services. You will build and maintain data pipelines, ensure data quality, collaborate with data science and other teams, and design data dashboards. The role requires expertise in PySpark, GCP data services, SQL, NoSQL, and data modeling. Yassir values diversity and inclusion.
Requirements
- PySpark
- GCP - BigQuery, Dataproc, Dataflow, Dataplex, Pub/Sub, and Cloud Storage
- Advanced SQL knowledge
- NoSQL (Preferably MongoDB)
- Programming languages - Scala/Python
- Great Expectations or a similar data quality (DQ) framework
- Familiarity with workflow management tools like Airflow, Prefect or Luigi
- Understanding of Data Governance, DWH and Data Modelling
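To illustrate the kind of data-quality checks a framework like Great Expectations codifies as declarative "expectations", here is a minimal stdlib-only sketch; the record shape, column names, and rules are invented for the example and are not part of the posting:

```python
# Sketch of declarative data-quality checks, in the spirit of Great
# Expectations. All column names and rules here are hypothetical.

def expect_not_null(rows, column):
    """Every row must have a non-null value in `column`."""
    failures = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"{column} not null", "passed": not failures,
            "failing_rows": failures}

def expect_between(rows, column, low, high):
    """Non-null values in `column` must fall within [low, high]."""
    failures = [i for i, r in enumerate(rows)
                if r.get(column) is not None and not (low <= r[column] <= high)]
    return {"check": f"{column} in [{low}, {high}]", "passed": not failures,
            "failing_rows": failures}

def validate(rows, checks):
    """Run all checks; the batch passes only if every check passes."""
    results = [check(rows) for check in checks]
    return all(r["passed"] for r in results), results

# Hypothetical ride records flowing through a pipeline.
rides = [
    {"ride_id": "r1", "fare": 12.5},
    {"ride_id": "r2", "fare": None},
    {"ride_id": None, "fare": 7.0},
]

ok, results = validate(rides, [
    lambda rs: expect_not_null(rs, "ride_id"),
    lambda rs: expect_between(rs, "fare", 0, 500),
])
```

In a real pipeline these checks would run as a validation step between ingestion and loading, failing the batch (or quarantining rows) when an expectation is violated.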
Responsibilities
- Build a centralized data lake on GCP Data services by integrating diverse data sources throughout the enterprise
- Develop, maintain, and optimize Spark-powered batch and streaming data processing pipelines. Leverage GCP data services for complex data engineering tasks and ensure smooth integration with other platform components
- Design and implement data validation and quality checks to ensure the accuracy, completeness, and consistency of data as it flows through the pipelines
- Work with the Data Science and Machine Learning teams to engage in advanced analytics
- Collaborate with cross-functional teams, including data analysts, business users, operational and marketing teams, to extract insights and value from data
- Collaborate with the product team to design, implement, and maintain the data models for analytical use cases
- Design, develop, and maintain data dashboards for various teams using Looker Studio
- Engage in technology exploration, research and development, and POCs; conduct deep investigations and troubleshooting
- Design and manage ETL/ELT processes, ensuring data integrity, availability, and performance
- Troubleshoot data issues and conduct root cause analysis when reporting data is in question
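The ETL/ELT responsibilities above can be sketched end to end. In this toy pipeline, sqlite3 stands in for a real warehouse such as BigQuery, and the sources, schema, and field names are all invented for illustration:

```python
import sqlite3

# Toy ETL pipeline: extract raw records, transform (clean + derive a
# field), load into a warehouse table. sqlite3 stands in for a real
# warehouse; the schema and field names are hypothetical.

def extract():
    # A real pipeline would read from Pub/Sub, Cloud Storage, MongoDB, etc.
    return [
        {"order_id": "o1", "amount_cents": 1250, "country": " eg "},
        {"order_id": "o2", "amount_cents": 900,  "country": "ZA"},
    ]

def transform(rows):
    # Normalize country codes and derive a decimal amount.
    return [
        {"order_id": r["order_id"],
         "amount": r["amount_cents"] / 100,
         "country": r["country"].strip().upper()}
        for r in rows
    ]

def load(conn, rows):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id TEXT PRIMARY KEY, amount REAL, country TEXT)")
    conn.executemany(
        "INSERT OR REPLACE INTO orders VALUES (:order_id, :amount, :country)",
        rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

In production the same extract/transform/load stages would be orchestrated as tasks in a workflow manager such as Airflow, with the data-quality checks wired in between transform and load.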
Preferred Qualifications
- Infrastructure as Code - Terraform
- Docker and Kubernetes
- Looker Studio
- AI and ML engineering knowledge