Big Data Engineer

Capco
Remote - Poland
Summary
Join Capco Poland, a global technology and management consultancy, as a Data Engineer. You will work on engaging projects with major banks, developing and enhancing financial and data solutions using Google Cloud Platform (GCP). Responsibilities include designing, implementing, and managing scalable data pipelines, building data workflows, collaborating with cross-functional teams, and ensuring data quality and security. This role requires hands-on GCP experience, strong Python programming skills, and experience with SQL and NoSQL databases. Capco offers various benefits, including remote work options, multiple employee benefit packages, access to online courses, and ongoing learning opportunities.
Requirements
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, GCS, and DataProc
- Strong proficiency in Python programming for data engineering tasks
- Proven experience working with SQL and NoSQL databases
- Solid understanding of building and maintaining data pipelines using APIs
- Knowledge of data pipeline orchestration and optimization techniques
- Experience working with large-scale data sets and cloud-based data storage systems
- Strong problem-solving abilities and a keen attention to detail
Responsibilities
- Design, implement, and manage scalable data pipelines using Google Cloud Platform (GCP) services, such as BigQuery, Google Cloud Storage (GCS), and DataProc
- Build and optimize data workflows using SQL and NoSQL databases and APIs
- Collaborate with cross-functional teams to gather requirements and ensure that data infrastructure supports the broader business objectives
- Write efficient, maintainable code in Python for data processing and integration tasks
- Troubleshoot and resolve issues related to data pipelines, databases, and cloud services
- Ensure data quality, reliability, and security throughout the data pipeline lifecycle
- Actively contribute to the design of new systems and maintain or upgrade existing data infrastructure
- Contribute to security designs, applying advanced knowledge of key security technologies, e.g. TLS, OAuth, and encryption
- Support internal Capco capabilities by sharing insights and experience
Preferred Qualifications
- Knowledge of PySpark for large-scale data processing
- Familiarity with data engineering best practices and emerging technologies
Benefits
- Employment contract or business-to-business (B2B) agreement, whichever you prefer
- Possibility to work remotely
- Multiple employee benefits packages (MyBenefit Cafeteria, private medical care, life insurance)
- Access to a platform with 3,000+ business courses (Udemy)
- Access to required IT equipment
- Paid Referral Program
- Participation in charity events, e.g. Szlachetna Paczka
- Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
- Being part of the core squad focused on the growth of the Polish business unit
- A flat, non-hierarchical structure that will enable you to work with senior partners and directly with clients
- A work culture focused on innovation and creating lasting value for our clients and employees