Data Engineer
NBCUniversal
Job highlights
Summary
Join NBCUniversal's global Operations & Technology organization as a Data Engineer and contribute to building next-generation data pipelines and applications supporting generative AI initiatives. This role involves collaborating with various teams to define data requirements, execute ETL/ELT processes, design and scale data pipelines, and optimize internal processes. You will leverage cloud-native principles and apply design patterns to ensure performance and scalability. The ideal candidate possesses 3+ years of data engineering experience, proficiency in Python/SQL, and experience with cloud-based data warehouses. This fully remote position offers competitive compensation and a comprehensive benefits package, including medical, dental, vision insurance, 401(k), paid leave, and tuition reimbursement.
Requirements
- 3+ years of experience in data engineering, demonstrating a foundational understanding of data modeling, ETL/ELT principles, and data warehousing
- Experience with data management fundamentals, data storage principles, cloud object storage (AWS S3, GCP Cloud Storage, Azure Blob Storage), and cloud-based data warehouses such as GCP BigQuery, Snowflake, or similar platforms
- Proficiency in building data pipelines using Python/SQL
- Demonstrated experience with workflow orchestration tools like Airflow, or a willingness to learn (see the sketch after this list)
- Experience in applying CI/CD principles and processes to data engineering solutions
- General understanding of cloud data engineering design patterns and use cases
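For illustration only, here is a minimal sketch of the kind of Airflow-orchestrated pipeline named in the requirements above, assuming Airflow 2.x and hypothetical extract/load callables; the actual pipelines, sources, and schedules are not specified in this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    # Hypothetical step: pull raw records from a source system or API.
    print("extracting data")


def load() -> None:
    # Hypothetical step: write transformed records to a cloud data warehouse.
    print("loading data")


with DAG(
    dag_id="example_daily_etl",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task     # run extract before load
```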
Responsibilities
- Engage with business leaders, engineers, and product managers to define and meet data requirements
- Work closely with technology teams to execute ETL/ELT processes, leveraging cloud-native principles to manage data from diverse sources (an illustrative example follows this list)
- Participate in the design, construction, and scaling of data pipelines, integrating data from various sources, including internal systems, third-party platforms, and cloud environments
- Support internal process optimizations by automating workflows, enhancing data delivery, and redesigning infrastructure to boost scalability
- Apply appropriate design patterns to ensure performance, cost-efficiency, security, scalability, and a positive end-user experience
- Be actively involved in development sprints, demonstrations, and retrospectives, contributing to the deployment and release processes
- Cultivate relationships with IT support teams to ensure the smooth deployment of work products
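As one concrete example of the ELT pattern described in the responsibilities above, the snippet below sketches loading Parquet files from cloud object storage into BigQuery with the google-cloud-bigquery client; the bucket, project, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
from google.cloud import bigquery

# Create a BigQuery client; credentials come from the environment
# (e.g., GOOGLE_APPLICATION_CREDENTIALS or workload identity).
client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Hypothetical source bucket and destination table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.parquet",
    "example-project.analytics.events",
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

table = client.get_table("example-project.analytics.events")
print(f"Loaded {table.num_rows} rows")
```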
Preferred Qualifications
- A Bachelor's degree in Computer Science, Data Science, Statistics, Informatics, Information Systems, Mathematics, Computer Engineering, or a related quantitative discipline is preferred
- Effective communication skills, capable of working collaboratively across diverse teams and navigating a large, matrixed organization efficiently
- Action-oriented: You're constantly figuring out new problems and consistently showing results with a positive attitude, while displaying ethical behavior and integrity and building trust
Benefits
- Medical, dental and vision insurance
- 401(k)
- Paid leave
- Tuition reimbursement
- A variety of other discounts and perks