Summary
Join GetInData | Part of Xebia, a leading data company, as an MLOps Engineer. You will streamline machine learning project lifecycles by designing and automating workflows, implementing CI/CD pipelines, and ensuring reproducibility. Collaborate with stakeholders and platform engineers to set up infrastructure, automate model deployment, monitor models, and scale training. This role requires proficiency in Python, experience with orchestration tools, and knowledge of ML algorithms and distributed training. You will work with GCP, BigQuery, Kubeflow, and Vertex AI. GetInData offers a competitive salary, 100% remote work, flexible hours, and opportunities for professional development.
Requirements
- Proficiency in Python, as well as experience with scripting languages like Bash or PowerShell
- Knowledge of at least one orchestration and scheduling tool, e.g., Airflow, Prefect, or Dagster
- Understanding of ML algorithms and distributed training, e.g., Spark / PyTorch / TensorFlow / Dask / Ray
- Experience with GCP and the BigQuery data warehouse (DWH) platform
- Hands-on experience with Kubeflow and Vertex AI
- Familiarity with tools like MLflow from an operations perspective
- Experience with containerization technologies like Docker and knowledge of container orchestration platforms like Kubernetes
- Understanding of continuous integration and continuous deployment (CI/CD) practices
- Ability to identify and analyze workflow problems across all teams involved, propose solutions, and navigate complex technical challenges
Responsibilities
- Creating, configuring, and managing GCP and K8s resources
- Managing Kubeflow and/or Vertex AI and its various components
- Collaborating and contributing to various GitHub repositories: infrastructure, pipelines, Python apps, and libraries
- Containerization and orchestration of Python DS/ML applications: Data/Airflow and ML/Kubeflow pipelines
- Setting up logging, monitoring, and alerting
- Profiling Python code for performance
- Scaling and reconfiguring components based on metrics
- Working with Data (BigQuery, GCS, Airflow), ML (Kubeflow/Vertex), and GCP infrastructure
- Streamlining processes to make data scientists' work more effective
Benefits
- Salary: 160–200 PLN net + VAT/h on a B2B contract (depending on knowledge and experience)
- 100% remote work
- Flexible working hours
- Possibility to work from the office located in the heart of Warsaw
- Opportunity to learn and develop with the best Big Data experts
- International projects
- Opportunity to conduct workshops and training sessions
- Certifications
- Co-financing of a sports card
- Co-financing of health care
- All equipment needed for work