Summary
Join AccuWeather's Data Operations team as a Data Operations Engineer II. This mid-level role focuses on scripting and deployment automation to optimize data processes: you will develop and maintain complex scripts, improve data infrastructure, build monitoring systems, and collaborate with cross-functional teams. The ideal candidate holds a Bachelor's degree, has 3-5 years of experience in data operations and scripting, and is highly proficient in scripting languages such as Python. The position offers a clear pathway for career growth within the Data Operations team, with opportunities to advance to senior-level roles.
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 3-5 years of professional experience in data operations, scripting, and automation
- Advanced proficiency in scripting languages such as Python, PowerShell, or similar
- Proficiency in Object-Oriented Programming (OOP) languages such as C++, Java, or similar
- In-depth understanding of data processing concepts and data integration tools
- Experience managing relational databases (e.g., PostgreSQL, MySQL, SQL Server) and writing SQL
- Experience working with cloud platforms (e.g., AWS, Azure, GCP)
- Exceptional analytical and problem-solving skills with meticulous attention to detail
- Excellent communication skills with the ability to collaborate effectively within a team environment
- Ability to adapt to changing priorities and manage multiple tasks simultaneously
- Proactive mindset with a strong willingness to learn and explore new technologies
- Demonstrated ability to debug systems, tracing issues back to their source
Responsibilities
- Develop, maintain, and optimize complex scripts using languages such as Python, Bash, or similar, to automate data collection and monitoring
- Improve and maintain data infrastructures and ensure the reliability and efficiency of data processes
- Troubleshoot and resolve issues within data infrastructure and processes to ensure continuous operation and data integrity
- Build and manage end-to-end monitoring systems and automated alert mechanisms to ensure the health and performance of data pipelines
- Implement tools and processes to proactively identify and address potential issues before they impact operations
- Create and maintain comprehensive documentation of scripts, automation and monitoring workflows, data pipelines, and deployment procedures for knowledge sharing and future reference
- Collaborate closely with cross-functional teams, providing support for scripting needs and contributing to the development of an effective automation strategy
Preferred Qualifications
- Experience with Datadog for monitoring and performance tracking
- Familiarity with Databricks for data engineering and analytics
- Experience with version control systems (e.g., Git)
- Demonstrated scripting and automation experience in a data context
- Experience working with deployment automation (CI/CD) tools and processes
Benefits
- This mid-level position offers a pathway for career growth within our Data Operations team
- Successful candidates may have opportunities to advance to senior-level roles, gaining more responsibilities and specialization within the field of data engineering