Remote Data Platform Engineer Specialist
QuintoAndar
Remote - Brazil
Please let QuintoAndar know you found this job on JobsCollider. Thanks!
Job highlights
Summary
Join our team at QuintoAndar as a Data Platform Engineer Specialist and contribute to building a high-performance data platform that meets the company's needs, connects with product solutions, and leads analytical innovation.
Requirements
- Specialist in Big Data technologies, solutions, and concepts (Spark, Hadoop, Hive, MapReduce) and multiple languages (YAML, Python)
- Proficiency in Python or another major programming language, and a passion for writing clean, maintainable code
- Understanding of the data lifecycle and concepts such as lineage, governance, privacy, retention, and anonymization
- Excellent communication skills, proactively sharing and seeking context to collaborate with different teams
Responsibilities
- Build and maintain a high-performance data platform that meets the company's needs, connects with product solutions, and leads analytical innovation, enabling amazing architectures and efficient platforms
- Own the entire code development lifecycle (monitoring deployments, adding metrics and alarms, staying within the SLO budget, and more)
- Align with stakeholders to understand their primary needs, while maintaining a holistic view of the problem and proposing extensible, scalable, and incremental solutions
- Conduct PoCs and benchmarks to determine the best tool for a given problem, and decide whether to adopt market solutions or build internally
- Contribute to defining the strategic vision, crossing team and service boundaries to solve problems
- Be a reference within the chapter for technical concepts, tools, and coding best practices
Preferred Qualifications
- You are a specialist in building large-scale data platforms for large datasets and teams, using Big Data technologies such as Spark, Kafka, Debezium, Trino, Hive, Atlas, Ranger, etc
- You are fluent with AWS and GCP cloud services
- You have strong experience with Apache Airflow or other tools for data workflows
- You have strong experience with columnar storage solutions and/or data lakehouse concepts
- You have knowledge in the area of infrastructure such as containers and orchestration (Kubernetes, ECS), CI/CD strategies, infrastructure as code (Terraform), observability (Prometheus, Grafana), among others
Benefits
- Competitive salary package
- Bonus
- Meal allowance ("Flash benefícios")
- Health plan
- Dental plan (optional)
- Life insurance
- Daycare subsidy
- Subsidy for sports activities (Gympass)
- Extended maternity and paternity leave
- Reserved room for breastfeeding*
- Discount at our parking lot*
- Language learning support
- Free transfer from Vila Madalena and Fradique Coutinho stations to the office*
- Free bike rack in our parking lot*