Poland
Big Data Engineer

Capco
Remote - Poland
Please let Capco know you found this job on JobsCollider. Thanks!
Summary
Join Capco Poland as a Senior Big Data Engineer in Kraków, working in a hybrid model. You will design, implement, and maintain scalable data pipelines and solutions using GCP services for large-scale data processing and analytics, working on engaging projects with major banks and contributing to their digital transformation through a continuous delivery model. You will collaborate with cross-functional teams, automate data engineering processes, and ensure data quality and security. The role offers opportunities to work with diverse technologies, learn new skills, and contribute to a growing business unit within a supportive and innovative work culture.
Requirements
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, GCS, and DataProc
- Strong proficiency in Python programming for data engineering tasks
- Proven experience working with SQL and NoSQL databases
- Solid understanding of building and maintaining data pipelines using APIs
- Knowledge of data pipeline orchestration and optimization techniques
- Experience working with large-scale data sets and cloud-based data storage systems
- Strong problem-solving abilities and keen attention to detail
Responsibilities
- Design, implement, and manage scalable data pipelines using Google Cloud Platform (GCP) services such as BigQuery, Google Cloud Storage (GCS), and DataProc (a brief sketch of this kind of task follows this list)
- Build and optimize data workflows using SQL and NoSQL databases and APIs
- Collaborate with cross-functional teams to gather requirements and ensure that data infrastructure supports the broader business objectives
- Write efficient, maintainable code in Python for data processing and integration tasks
- Troubleshoot and resolve issues related to data pipelines, databases, and cloud services
- Ensure data quality, reliability, and security throughout the data pipeline lifecycle
- Actively contribute to the design of new systems and maintain or upgrade existing data infrastructure
- Contribute to security designs and have advanced knowledge of key security technologies, e.g. TLS, OAuth, and encryption
- Support internal Capco capabilities by sharing insights and experience
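
For illustration only, a typical task of the kind described above, loading a file from Google Cloud Storage into BigQuery with Python, might look roughly like the minimal sketch below. The project, bucket, and table names are hypothetical placeholders, not Capco or client systems.

```python
# Minimal sketch: load a CSV file from Google Cloud Storage into BigQuery.
# The bucket, path, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project credentials

gcs_uri = "gs://example-bucket/raw/transactions.csv"   # placeholder source file
table_id = "example-project.analytics.transactions"    # placeholder destination table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema from the file
)

load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to finish

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```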
Preferred Qualifications
- Knowledge of PySpark for large-scale data processing (a brief sketch follows this list)
- Familiarity with data engineering best practices and emerging technologies
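
As a rough illustration of the PySpark skills mentioned above, a minimal aggregation job might look like the sketch below; the input path, output path, and column names are hypothetical placeholders.

```python
# Minimal PySpark sketch: aggregate a large CSV dataset and write the result as Parquet.
# File paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

# Read a (potentially large) CSV dataset with an inferred schema.
df = spark.read.csv("gs://example-bucket/raw/events/", header=True, inferSchema=True)

# Count events and sum amounts per customer.
summary = (
    df.groupBy("customer_id")
      .agg(F.count("*").alias("event_count"), F.sum("amount").alias("total_amount"))
)

summary.write.mode("overwrite").parquet("gs://example-bucket/curated/customer_summary/")
spark.stop()
```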
Benefits
- Employment contract or business-to-business (B2B) contract - whichever you prefer
- Possibility to work remotely
- Multiple employee benefit packages (MyBenefit Cafeteria, private medical care, life insurance)
- Access to a platform with 3,000+ business courses (Udemy)
- Access to required IT equipment
- Paid Referral Program
- Participation in charity events, e.g. Szlachetna Paczka
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
Disclaimer: Please check that the job is real before you apply. Applying might take you to another website that we don't own. Please be aware that any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.