AWS Data Engineer

OZ

πŸ“Remote - United States

Summary

Join OZ as a Data Engineer and transform raw data into valuable data systems through methods such as algorithm development and statistical analysis. You will align data systems with business goals to maximize efficiency. Success in this role requires strong analytical skills and the ability to combine data from multiple sources. You will design and implement highly available data services and pipelines as part of a team. OZ offers a remote-friendly environment, competitive compensation, and a comprehensive benefits package.

Requirements

  • 5+ years of work experience with ETL, Data Modeling, and Data Architecture
  • Expert-level skills in writing and optimizing SQL
  • Experience with Big Data technologies such as Hadoop/Hive/Spark
  • Solid Linux skills
  • Experience operating very large data warehouses or data lakes
  • Expertise in ETL optimization, designing, coding, and tuning big data processes using Apache Spark or similar technologies
  • Experience with building data pipelines and applications to stream and process datasets at low latencies
  • Efficiency in handling data: tracking data lineage, ensuring data quality, and improving data discoverability
  • Sound knowledge of distributed systems and data architecture (e.g., the Lambda architecture): ability to design and implement batch and stream data processing pipelines, and to optimize the distribution, partitioning, and MPP of high-level data structures

Responsibilities

  • Analyze and organize raw data
  • Build data systems and pipelines
  • Evaluate business needs and objectives
  • Interpret trends and patterns
  • Conduct complex data analysis and report on results
  • Prepare data for prescriptive and predictive modeling
  • Build algorithms and prototypes
  • Combine raw information from different sources
  • Explore ways to enhance data quality and reliability
  • Identify opportunities for data acquisition
  • Develop analytical tools and programs
  • Collaborate with data scientists and architects on several projects

Preferred Qualifications

  • AWS Cloud
  • Redshift
  • S3
  • AWS Lake Formation
  • AWS Glue
  • AWS QuickSight
  • API development (specifically to connect to SaaS solutions)
  • Hadoop, Spark, Kafka, Linux scripting, etc.
  • Experience with AWS/Azure cloud services and platforms
  • Experience with Snowflake data platform
  • Experience with Matillion, Databricks, or equivalent
  • Experience with object-oriented/functional scripting languages: Python, Java, etc.

Benefits

  • Full health benefits
  • 401(k)
  • Unlimited PTO
  • Remote work
