Senior Data Engineer

Pismo

πŸ“Remote - Brazil

Summary

Join Pismo's Data Team and play a central role in delivering data-intensive analytics and processing terabytes of high-quality data for internal and external use. The team operates globally and is expanding, and we are looking for talented engineers to contribute to our data platform. You will design and develop robust, scalable data pipelines using modern Big Data frameworks, drive platform improvements, and take ownership of production operations. Collaboration with cross-functional teams is key, as is mentoring peers and ensuring adherence to best practices. You will also participate in on-call rotations and conduct root-cause analyses to enhance efficiency and reliability. This role offers the opportunity to influence multiple teams and promote a culture of technical excellence.

Requirements

  • English at B1 level (intermediate) or above
  • Strong experience as a Data Engineer
  • Good experience with AWS services (e.g., S3, DMS, Lambda, Kinesis, and IAM)
  • Apache Spark experience or similar tools (e.g., Flink, Hadoop)
  • Python experience
  • Architectural design experience
  • Git/GitHub experience
  • Monitoring tools: Grafana and CloudWatch (logs, tracing, spans, alerts, dashboards)
  • SQL experience

Responsibilities

  • Serve as a senior technical contributor on Pismo's data stack, leading the design and development of robust, scalable data pipelines using modern Big Data frameworks
  • Drive continuous improvement in platform performance, reliability, and cost-effectiveness, proactively identifying opportunities for innovation
  • Take ownership of production operations and monitoring (file-based systems), ensuring adherence to SLAs and applying best practices for observability and resilience
  • Collaborate with cross-functional teams to architect, build, and evolve high-complexity data features, reducing technical debt and championing engineering excellence
  • Provide mentorship to peers through code reviews, feedback, and support, fostering an environment of knowledge sharing and professional development
  • Ensure that data pipelines follow the best testing, monitoring, observability, and security practices
  • Participate in on-call rotations, promptly addressing and resolving production incidents, while proactively implementing preventative measures
  • Build and optimize ETL/ELT processes to handle large-scale data, leveraging advanced data processing and distributed systems techniques
  • Maintain rigorous data governance and platform standards, working closely with data stakeholders to ensure consistent, high-quality data assets
  • Conduct root-cause analyses for critical issues, challenging and refining team processes to enhance efficiency and reliability
  • Influence multiple teams by promoting a culture of technical excellence, forward-thinking solutions, and continuous improvement

Preferred Qualifications

  • CI/CD experience
  • Airflow experience
  • CDC (change data capture) experience

Benefits

  • Remote work
  • Flexible hours
  • Meal & Food vouchers
  • Remote work financial support
  • Life Insurance
  • Medical and Dental
  • Employee child care assistance benefit (daycare)
  • Private Pension (2x1)
  • Vidalink partnership
  • Support for studying languages
  • Incentive for AWS and GCP certifications
  • Sesc Partnership
  • Performance Incentive Plan
