Summary
Join our team as a Data Engineer to design, implement, and maintain data pipelines for cloud projects. You will work with complex data sources, transforming them into formats analysts can use. This role requires a Bachelor's degree and 5 years of experience in cloud computing, software engineering, and data processing. Proficiency in a range of tools and technologies, including ETL tools, Hadoop-based platforms, RDBMS, and programming languages such as Python and SQL, is essential. Strong analytical, problem-solving, and communication skills are also required, as is the ability to manage multiple projects and adapt to a changing environment.
Requirements
- Have a Bachelor's Degree
- Have 5 years of experience
- Possess a strong background in cloud computing, software engineering and data processing
- Have data management experience
- Have experience with ETL tools such as Pentaho, Talend, Informatica, Azure Data Factory, Apache Kafka, and Apache Camel
- Have experience designing and implementing analysis solutions on Hadoop-based platforms such as Cloudera Hadoop or Hortonworks Data Platform, or Spark-based platforms such as Databricks
- Be proficient in RDBMS such as Oracle, SQL Server, DB2, and MySQL
- Possess strong analytical and problem-solving skills
- Possess strong verbal and written communication skills
- Have proficient programming skills in Python, SQL, NoSQL, and Spark
- Have the ability to manage multiple projects
- Have the ability to work independently or in groups
- Have the ability to prioritize and manage time effectively
- Have the ability to adapt to a rapidly changing environment
Responsibilities
- Be responsible for the design, development, and maintenance of data pipelines that enable data analysis and reporting
- Build, evolve, and scale out infrastructure to ingest, process, and extract meaning from data
- Write complex SQL queries or Python code to support analytics needs
- Manage projects and processes, working independently with limited supervision
- Work with structured and unstructured data from a variety of data stores, such as data lakes, relational database management systems, and/or data warehouses
- Combine, optimize, and manage multiple big data sources
- Build data infrastructure and determine proper data formats to ensure data is ready for use
Preferred Qualifications
- Have relevant experience, education, licensure(s), or specialized certification(s)
- Hold cloud certifications in AWS, Azure, or GCP