Data Lake Architect

Hakkoda

📍Remote - Costa Rica

Summary

Join Hakkoda, an IBM Company, as an AWS Managed Services Architect and play a pivotal role in architecting and optimizing the infrastructure and operations of complex Data Lake environments for our clients. Leverage your AWS expertise to design, implement, and maintain scalable and secure data solutions. Collaborate with delivery teams across various regions, ensuring a robust Data Lake architecture. Proactively engage with clients to support their evolving needs and guide teams toward innovative solutions. This hands-on role requires designing solutions, troubleshooting, optimizing performance, and maintaining operational excellence. We offer a fast-paced, dynamic environment where your input and efforts are valued. We are looking for curious and creative individuals who want to be part of a collaborative atmosphere that values learning, growth, and hard work.

Requirements

  • 7+ years of hands-on experience in cloud architecture and infrastructure (preferably AWS)
  • 3+ years of experience specifically architecting and managing Data Lake or big data solutions on AWS
  • Expertise in AWS services such as EMR, Batch, SageMaker, Glue, Lambda, IAM, Amazon Timestream, DynamoDB, and more
  • Strong programming skills in Python for scripting and automation
  • Proficiency in SQL and performance tuning for data pipelines and queries
  • Experience with IaC tools such as Terraform
  • Knowledge of big data frameworks such as Apache Spark, Hadoop, or similar
  • Proven ability to design and implement secure solutions, with strong knowledge of IAM policies and compliance standards
  • An analytical and problem-solving mindset to resolve complex technical challenges
  • Exceptional communication skills to engage with technical and non-technical stakeholders
  • Ability to lead cross-functional teams and provide mentorship
  • Bachelor’s Degree (BA/BS) in Computer Science, Information Systems, or a related field
  • AWS Certifications, such as Solutions Architect Professional or Big Data Specialty

Responsibilities

  • AWS Data Lake Architecture: Design, build, and support scalable, high-performance architectures for complex AWS Data Lake solutions
  • Service Expertise: Deploy and manage solutions using AWS services, including but not limited to:
      ◦ EMR (Elastic MapReduce): Optimize and maintain EMR clusters for big data processing
      ◦ AWS Batch: Design workflows to execute batch processing workloads effectively
      ◦ SageMaker: Support data science teams with scalable model training and deployment
      ◦ Glue: Implement Glue jobs for ETL/ELT processes to ensure efficient data ingestion and transformation
      ◦ Lambda: Develop serverless solutions to automate processes and manage events
      ◦ IAM Policies: Define and enforce security policies to control resource access and maintain governance
      ◦ Amazon Timestream: Design solutions to handle time-series data at scale
      ◦ DynamoDB: Build and optimize scalable NoSQL database solutions
  • Data Governance & Security: Enforce compliance, governance, and security best practices, ensuring data protection and privacy throughout the architecture
  • Performance Optimization: Monitor and fine-tune performance across AWS resources to ensure cost-effective and efficient operations
  • Automation: Develop Infrastructure as Code (IaC) solutions using tools like AWS CloudFormation, Terraform, or similar
  • Client Collaboration: Work closely with clients to understand their business goals and ensure the architecture aligns with their needs
  • Team Leadership: Act as a technical mentor for delivery teams and provide support in troubleshooting, design reviews, and strategy discussions
  • Innovation: Stay updated on AWS advancements, best practices, and emerging tools to incorporate into solutions
  • Documentation: Develop and maintain architecture diagrams, SOPs, and knowledge-sharing materials for internal and client-facing purposes

Preferred Qualifications

  • Experience with Snowflake, Matillion, or Fivetran in hybrid cloud environments
  • Familiarity with Azure or GCP cloud platforms
  • Understanding of machine learning pipelines and workflows

Benefits

  • Comprehensive Life Insurance: Includes dental and vision coverage, wellness benefits, home spa treatments, express doctor visits, and more
  • Paid Parental Leave
  • Flexible PTO Options
  • Company Bonus Program
  • Asociación Solidarista
  • Technical Training & Certifications
  • Extensive Learning and Development Opportunities
  • Flexible Work-from-Home Policy
  • Work from Anywhere Benefit
