Summary
Join Aryng as a Data Engineer and build enterprise-class distributed data engineering solutions on the cloud. You will implement asynchronous data ingestion, high volume stream data processing, and real-time data analytics. This role requires implementing application components using cloud technologies, defining data pipelines, and identifying bottlenecks. You will work with a team of data scientists, business analysts, and engineers to create effective solutions for clients. The ideal candidate has 3-5 years of data engineering experience, strong SQL and Python skills, and experience with cloud platforms (AWS preferred). This 100% remote role offers flexible hours and a competitive salary.
Requirements
- 3-5 years of data engineering experience (required)
- 3+ years implementing and managing data engineering solutions on cloud platforms (GCP, AWS, or Azure; AWS preferred) or on-premise distributed servers
- Comfortable working and interacting directly with clients
- 2+ years' experience in Python
- Strong command of SQL and relational database concepts
- Experience with BigQuery, Snowflake, Redshift, or dbt
- Strong understanding of data warehousing, data lake, and cloud concepts
- Excellent communication and presentation skills
- Excellent problem-solving skills, highly proactive and self-driven
- B.S. in computer science, software engineering, computer engineering, electrical engineering, or a related field (required)
Responsibilities
- Implement asynchronous data ingestion, high-volume stream data processing, and real-time data analytics using various data engineering techniques
- Implement application components using cloud technologies and infrastructure
- Assist in defining data pipelines and identify bottlenecks to enable the adoption of data management methodologies
- Implement cutting-edge cloud platform solutions using the latest tools and platforms offered by GCP, AWS, and Azure (AWS preferred)
- Lead requirements gathering, client management, team leadership, program delivery, and project management (project estimation, scoping, Agile methodology)
Preferred Qualifications
- Experience with some of the following: Apache Beam, Hadoop, Airflow, Kafka, Spark
- Experience with Tableau, Looker, or other BI tools
- Consulting background is a big plus
Benefits
- Flexible work hours
- Competitive Salary
- 50%+ Tax Benefit
- 100% Remote company