Senior Data Engineer

Capco

πŸ“Remote - Poland

Summary

Join Capco Poland, a global technology and management consultancy, as a Senior Big Data Engineer. This hybrid role (2 days/week in Krakow) involves designing, implementing, and maintaining scalable data pipelines and solutions. You will collaborate with cross-functional teams, optimize Spark jobs, ensure data quality, and stay current with industry best practices. The ideal candidate has at least 5 years of experience as a Data Engineer/Big Data Engineer, excellent SQL and Python skills, and experience with Spark and Hadoop. Capco offers a range of benefits, including remote work options, multiple employee benefit packages, access to online courses, and a supportive work culture.

Requirements

  • Minimum 5 years of experience as a Data Engineer/Big Data Engineer
  • University degree in computer science, mathematics, natural sciences, or similar field and relevant working experience
  • Excellent SQL skills, including advanced concepts
  • Very good programming skills in Python
  • Experience in Spark and Hadoop
  • Experience in OOP
  • Experience using agile frameworks like Scrum
  • Interest in financial services and markets
  • Fluent English communication and presentation skills
  • Sense of humor and positive attitude

Responsibilities

  • Design, develop, and maintain robust data pipelines using Python, Spark, Hadoop, and SQL for batch and streaming data processing
  • Collaborate with cross-functional teams to understand data requirements and design efficient solutions that meet business needs
  • Optimize Spark jobs and data processing workflows for performance, scalability and reliability
  • Ensure data quality, integrity and security throughout the data lifecycle
  • Troubleshoot and resolve data pipeline issues in a timely manner to minimize downtime and impact on business operations
  • Stay updated on industry best practices, emerging technologies, and trends in big data processing and analytics
  • Document design specifications, deployment procedures, and operational guidelines for data pipelines and systems
  • Provide technical guidance and mentorship for new joiners

Preferred Qualifications

  • Experience with or knowledge of GCP, including Pub/Sub and BigQuery
  • Experience with Kafka, Juniper, Apache NiFi, Hive, Impala, Cloudera, and CI/CD

Benefits

  • Employment contract or Business-to-Business (B2B) contract, whichever you prefer
  • Possibility to work remotely
  • Multiple employee benefit packages (MyBenefit Cafeteria, private medical care, life insurance)
  • Access to a platform with 3,000+ business courses (Udemy)
  • Access to required IT equipment
  • Paid Referral Program
  • Participation in charity events e.g. Szlachetna Paczka
  • Ongoing learning opportunities to help you acquire new skills or deepen existing expertise
