Principal ETL Engineer

DigiCert
Summary
Join DigiCert as a Principal ETL Engineer and contribute to a leading global security authority. Research data sources, compile SQL for data warehouse integration, and review ETL requirements to build data deliverables. Analyze SQL queries for performance improvements, perform data validation and modeling, and design ETL pipelines. Convert business requirements into technical specifications, resolve performance bottlenecks, and improve ETL processes. Build processes supporting data transformation, metadata, and workload management, and write and validate DML, DDL, and DCL scripts. Modify ETL feeds to accommodate changing business processes, provide data warehouse support, and troubleshoot operations. Collaborate with stakeholders, and travel is required 12% of the time. The position is 100% remote, reporting to HQ in Lehi, UT.
Requirements
- Bachelor's degree or U.S. equivalent in Information Technology, Computer Science, Computer Engineering, Computer Information Systems, or related field, plus 5 years of professional experience as a Data Analyst, Software Developer, or any occupation/position/job title involving performing data validation and data modeling to integrate data into data warehouses
- Alternatively, 3 years of post-secondary studies in Information Technology, Computer Science, Computer Engineering, Computer Information Systems, or related field, plus 6 years of professional experience as a Data Analyst, Software Developer, or any occupation/position/job title involving performing data validation and data modeling to integrate data into data warehouses
- 5 years of professional experience building and optimizing SQL queries and data deliverables per stakeholder requirements
- 5 years of professional experience utilizing Linux-based systems
- 5 years of professional experience utilizing relational and non-relational data systems
- 5 years of professional experience performing data quality checks, quality assurance (QA), and data validation
- 5 years of professional experience performing data modeling and data warehouse design
- 5 years of professional experience working with large scale data structures and pipelines
- 5 years of professional experience utilizing Git for version control
- 5 years of professional experience utilizing API connections and concepts
Responsibilities
- Research data sources and compile SQL to integrate data into the data warehouse
- Review ETL requirements and build data deliverables per business unit requirements
- Analyze existing SQL queries for performance improvements, working alongside BI Analysts and developers
- Perform data validation and data modeling to maintain an optimal data warehouse
- Design and implement ETL pipelines to support and enhance the data warehouse
- Convert business requirements to technical specifications and implement them
- Identify and resolve performance bottlenecks and work with back-end pipeline developers on implementation
- Work with stakeholders to continually improve ETL processes and pre-emptively identify potential concerns
- Build processes supporting data transformation, metadata, dependency and workload management
- Write, execute, and validate DML, DDL, and DCL scripts to meet business and customer needs
- Modify ETL feeds to accommodate ever-changing business processes
- Provide day-to-day support of the data warehouse and troubleshoot existing processes and operations
- Share team responsibilities for all aspects involved with maintaining a high level of uptime
- Communicate with stakeholders and work effectively on projects while keeping them informed of progress