Data Engineer

Digibee

๐Ÿ“Remote - Brazil

Summary

Join our growing team as a Lead Data Engineer and architect the data backbone for Digibee's business intelligence. You will design, build, and manage Digibee's internal data platform, shaping its data architecture and developing scalable data solutions. This hands-on role requires deep technical expertise and leadership: you will own the end-to-end data product lifecycle and set best practices. You will guide your engineering partner, deliver impactful data solutions aligned with Digibee's strategic objectives, and integrate FinOps principles. The position involves leveraging Google Cloud Platform (GCP) and best-in-class SaaS tools. If you excel at shaping data strategy while coding and driving results, this is an impactful opportunity.

Requirements

  • Proven experience in large-scale data management, including data lake and warehouse architectures, automation, access control, and segregation of data for security and compliance
  • Demonstrated proficiency in Python, Java, or Go, and experience with big data processing frameworks such as Hadoop, Spark, and Kafka
  • Extensive experience with Google Cloud Platform (e.g., Cloud Run, Cloud Functions, Pub/Sub)
  • Strong skills in data modeling, ETL processes, and knowledge of data privacy laws and best practices
  • Ability to forecast and manage costs associated with data tools and services, optimizing for scalability and efficiency
  • Ability to effectively present technical concepts and project impact to both technical and non-technical stakeholders

Responsibilities

  • Define and implement data management strategies, building robust data lakes, warehouses, and pipelines that support data-driven decisions
  • Ensure compliance with data security standards such as GDPR, LGPD, HIPAA, and CCPA
  • Develop and optimize end-to-end ELT/ETL pipelines that connect various data sources (databases, APIs) to user interfaces and analytics platforms
  • Utilize Google Cloud services, including Cloud Storage, BigQuery, and Pub/Sub
  • Forecast and manage data-related costs for tools, storage, and processing
  • After implementation, oversee data ingestion, tool evaluation, and ongoing maintenance to drive continuous improvement
  • Recommend and utilize visualization tools to deliver actionable insights and uncover trends across datasets
  • Communicate findings and project value effectively to stakeholders
  • Lead data engineering teams, mentor junior engineers, and champion best practices in software engineering, compliance, and cost-effective data management

Preferred Qualifications

  • Experience with visualization tools (e.g., Tableau, Power BI, Looker Studio) and orchestration frameworks (e.g., Apache Airflow, Dagster)
  • Familiarity with Terraform or OpenTofu for infrastructure management and GitLab CI for continuous integration

Benefits

  • We're remote-first, with a flexible working schedule
  • Health care
  • R$ 1.200,00/month on Caju card (for food and meal allowance, mobility, home office supplies, culture, health, and education)
  • Life insurance
  • Child care assistance
  • Gympass
  • English course: group classes through a partnership for R$ 100/month

