DataOps Engineer
Alter Solutions Portugal
Remote - Portugal
Please let Alter Solutions Portugal know you found this job on JobsCollider. Thanks!
Job highlights
Summary
Join our team as a DataOps Engineer and play a crucial role in maintaining and optimizing our data infrastructure. You will be responsible for incident response, infrastructure management, database optimization, and deployments. The role requires extensive experience with GCP, Airflow, BigQuery, and Terraform. You will collaborate with Data Architects and Data Engineers to ensure efficient, reliable data flows. The ideal candidate has strong problem-solving skills and a proactive approach to identifying and resolving issues. We offer a challenging and rewarding environment for experienced data professionals.
Requirements
- General knowledge of the Google Cloud Platform and its services, with at least one year of experience on GCP
- At least two years of experience with the Airflow orchestrator
- Extensive experience (at least 4 years) with Google BigQuery, including table and query optimization and the ability to design database architecture
- At least two years of experience with Terraform, and knowledge of GitOps best practices
Responsibilities
- Understand problems from the user's perspective and communicate clearly to pin down the issue
- Reproduce bugs or issues that users are facing
- Apply root cause analysis to quickly and efficiently find the root cause of a problem, patch it, test the fix, and communicate the resolution to the end user
- Write postmortems summarizing every step of the resolution, helping the team track all issues
- Monitor existing flows and infrastructure to identify potential issues, and apply the same resolution process to bugs or issues discovered through monitoring and alerting
- Adapt configurations to keep flows and infrastructure working as expected, keeping operations incident-free
- Track processing costs and times through dedicated dashboards
- Alert users whose poorly written queries incur high costs
- Track down jobs, views, and tables that run inefficiently, incurring high costs or slow execution
- Optimize jobs, queries, and tables to improve both cost and execution speed
- Manage infrastructure through Terraform
- Share and propose good practices
- Decommission unused infrastructure such as services, tables, or virtual machines
- Plan upcoming deployments with a Data Architect and participate in Deployment Reviews
- Share and propose deployment best practices
- Accompany Data Engineers throughout the entire deployment process
- Accompany Data Engineers during the subsequent period of active monitoring
- Ensure diligent application of the deployment process, logging, and monitoring strategy
- Take over newly deployed flows in the run process
Preferred Qualifications
- Experience with Google Composer
- Apache Spark expertise; some of our pipelines use PySpark
- Pub/Sub knowledge
- Kafka knowledge
- Azure Analysis Services knowledge
- Google Cloud Storage optimization knowledge