Remote Data Engineer

CDC Foundation

💵 $103k-$143k
📍 Remote - United States

Job highlights

Summary

Join the CDC Foundation as a Data Engineer to design, build, and maintain data infrastructure for a public health organization. This role will play a crucial part in advancing the CDC Foundation's mission by implementing high-performance data systems, ensuring reliability and scalability of our data infrastructure, and enabling robust analytics and insights for the organization.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field
  • Minimum 5 years of relevant professional experience
  • Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL
  • Experience with big data technologies and frameworks like Hadoop, Spark, Kafka, and Flink
  • A high level of proficiency in Snowflake is required, including advanced features such as Time Travel, Zero-Copy Cloning, and data sharing (see the sketch after this list)
  • Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review
  • Knowledge of data warehousing concepts and tools
  • Familiarity with data lake and lakehouse architectures
  • Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data integration techniques
  • Familiarity with agile development methodologies, software design patterns, and best practices
  • Strong analytical thinking and problem-solving abilities
  • Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively
  • Flexibility to adapt to evolving project requirements and priorities
  • Outstanding interpersonal and teamwork skills, and the ability to develop productive working relationships with colleagues and partners
  • Experience working in a virtual environment with remote partners and teams
  • Proficiency in Microsoft Office

Responsibilities

  • Develop a detailed plan for database migration, ETL processes, and data processing applications
  • Design, build, and manage ETL/ELT processes and data pipelines on the Snowflake platform, ensuring the movement of large datasets between various data sources
  • Develop efficient, scalable data architectures and implement Snowflake best practices, including partitioning, clustering, and query optimization for performance and cost (see the sketch after this list)
  • Collaborate with data scientists, analysts, and local health departments to integrate diverse data sources into Snowflake, ensuring data is available for analytics and reporting
  • Monitor data pipelines and systems for performance issues, costs, errors, and anomalies, and implement solutions to address them
  • Collaborate with the IT Security Team to conduct security and access testing. Implement security measures to protect sensitive information
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs
  • Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data
  • Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses
  • Stay current on industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure
  • Provide technical guidance to other staff. Create and maintain clear documentation for ETL processes, data pipelines, data models, and infrastructure setups
  • Develop training materials and conduct online sessions on accessing and utilizing shared data
  • Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings
  • Create a data governance framework for secure and compliant data sharing
  • Establish a connection migration plan for ETL processes and APIs between migrated applications and databases
  • Implement automated processes for data extraction from source systems and loading into the data warehouse
  • Migrate ETL processes and APIs to the cloud environment
