Remote Data Engineer

Hakkoda

๐Ÿ“Remote - Portugal

Job highlights

Summary

Join Hakkoda, a modern data consultancy, as a Data Engineer. You will build and optimize data pipelines and architectures in the cloud (AWS, Azure, GCP) for international clients, expand and optimize clients' data architecture, and develop solutions in the Snowflake Data Cloud. You will collaborate with data experts to create cutting-edge solutions that support business decision-making. The ideal candidate has experience building data warehouses and analytics systems and a passion for optimizing data processes and pipelines. We are seeking someone excited to join a fast-growing global company and contribute to the development of next-generation data solutions. Hakkoda offers a remote-first work environment with access to a Lisbon co-working space.

Requirements

  • 2+ years of experience as a Data Engineer or in a related technical role
  • Bachelor's Degree in Computer Science, Information Systems, Mathematics, MIS, or a related field
  • Experience developing data warehouses and building ETL/ELT ingestion pipelines
  • Strong experience in optimizing "big data" pipelines, architectures, and data sets
  • Proficiency in SQL scripting and working with relational databases
  • Experience with business intelligence and analytics, particularly working with unstructured data
  • Cloud experience with AWS (experience with Azure and GCP is a plus)
  • Proficiency in Python scripting
  • Strong consulting skills
  • Fluency in English, both written and spoken

Responsibilities

  • Build and optimize data pipelines and architectures within the cloud (AWS, Azure, GCP) for international clients across industries
  • Collaborate with data experts, analysts, architects, and data scientists to create cutting-edge solutions that support business decision-making
  • Expand and optimize our clients' data architecture and develop solutions in the Snowflake Data Cloud
  • Design and maintain data pipeline architectures to ensure the efficient and reliable flow of data across projects
  • Assemble complex data sets
  • Build and optimize ETL/ELT infrastructure
  • Implement scalable data models that support data storage, retrieval, and growth
  • Ensure robust data governance and security practices are applied consistently
  • Collaborate with cross-functional teams to develop solutions that enhance performance and operational efficiency

Preferred Qualifications

Experience with Azure and GCP

Benefits

  • Full access to our cozy co-working space in the heart of Lisbon (Saldanha)
  • Comprehensive health insurance
  • Competitive meal allowance
  • Annual bonus opportunities
  • 22 days of paid time off, plus 2 additional days (your birthday and Christmas Eve)
  • Initial home office budget
  • Work-from-home allowance
  • Technical training and certification programs
