Senior Consultant Big Data and Cloud

Capco
Summary
Join Capco, a global technology and management consulting firm, as a Senior Consultant specializing in Big Data platforms and Cloud environments. In this role, you will design, develop, and maintain Big Data solutions, ensuring data quality and efficient processing, and collaborate with data, engineering, and business teams to deliver client-focused solutions while applying best practices in data engineering, security, and governance. The position requires 6 to 8 years of data engineering experience with a strong focus on Big Data projects and expertise in tools such as Hadoop, Spark, Hive, and Kafka. Advanced English is essential for daily collaboration with multicultural teams. Capco offers an inclusive work environment where individuality is celebrated.
Requirements
- 6 to 8 years of experience in data engineering, with a strong focus on Big Data projects
- Solid experience with tools and frameworks such as Hadoop, Spark, Hive, and Kafka
- Experience with Cloud platforms (preferably AWS, Azure, or GCP)
- Knowledge of data pipelines, ETL/ELT, and distributed data models
- Ability to work collaboratively in agile and multidisciplinary environments
- Advanced English for daily collaboration with multicultural teams
Responsibilities
- Design, develop, and maintain Big Data solutions, ensuring efficient processing and data quality
- Work on the architecture, ingestion, transformation, and provisioning of data on modern platforms (on-premise and/or cloud)
- Collaborate with data, engineering, and business teams to ensure the delivery of solutions that meet client needs
- Promote and apply best practices in data engineering, security, and information governance
- Support initiatives to migrate or modernize data environments to the cloud
Preferred Qualifications
- Experience with Data Lake, Delta Lake, and event-driven architectures
- Cloud certifications (e.g., AWS Certified Data Analytics or Azure Data Engineer)
- Knowledge of languages such as Python, Scala, or Java for data processing
- Familiarity with orchestration tools such as Airflow, Apache NiFi, dbt, or similar
- Understanding of best practices in data governance, security, and compliance