Remote Snowflake Data Engineer

iSoftTek Solutions

πŸ“Remote - United States

Job highlights

Summary

This is a remote Snowflake Data Engineer position with a 2-year term. The role sits on the Platform Engineering Team, working to solve the Multiplicity Problem and supporting the Financial Trading Industry. Responsibilities include designing, developing, and maintaining data pipelines, implementing ETL processes, optimizing data warehouse performance, integrating Snowflake with external systems, monitoring query performance, and working with SQL and cloud-based technologies.

Requirements

  • Work on Snowflake modeling (roles, databases, schemas) and ETL tools, with cloud-based skills
  • Work on SQL performance measuring, query tuning, and database tuning
  • Work with SQL and cloud-based technologies
  • Set up the RBAC model at the infrastructure and data levels
  • Work on Data Masking / Encryption / Tokenization, and Data Wrangling / ECreLT / data pipeline orchestration (Tasks)
  • Set up AWS S3/EC2, configure external stages, and SQS/SNS (see the setup sketch after this list)
  • Perform data integration, e.g., MSK Kafka Connect and other partners such as Delta Lake (Databricks)
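
Below is a minimal Snowflake SQL sketch of the RBAC, external-stage, and masking setup described in these requirements. All object names (ENGINEER_ROLE, ANALYTICS_DB, the RAW schema, the S3 bucket, and the IAM role ARN) are hypothetical placeholders, and the exact grants and notification wiring would depend on the environment.

    -- Role-based access control at the database and schema level
    CREATE ROLE IF NOT EXISTS ENGINEER_ROLE;
    CREATE DATABASE IF NOT EXISTS ANALYTICS_DB;
    CREATE SCHEMA IF NOT EXISTS ANALYTICS_DB.RAW;
    GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ENGINEER_ROLE;
    GRANT USAGE, CREATE TABLE ON SCHEMA ANALYTICS_DB.RAW TO ROLE ENGINEER_ROLE;

    -- Storage integration and external stage over an S3 bucket (bucket and ARN are hypothetical)
    CREATE STORAGE INTEGRATION IF NOT EXISTS S3_INT
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'S3'
      ENABLED = TRUE
      STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-access'
      STORAGE_ALLOWED_LOCATIONS = ('s3://example-bucket/raw/');

    CREATE STAGE IF NOT EXISTS ANALYTICS_DB.RAW.S3_RAW_STAGE
      URL = 's3://example-bucket/raw/'
      STORAGE_INTEGRATION = S3_INT
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- Column-level data masking, as referenced above; the policy would later be
    -- attached to a column with ALTER TABLE ... MODIFY COLUMN ... SET MASKING POLICY
    CREATE MASKING POLICY IF NOT EXISTS MASK_EMAIL AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ENGINEER_ROLE') THEN val ELSE '***MASKED***' END;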

Responsibilities

  • Design, develop, and maintain data pipelines to ingest, transform, and load data from various sources into Snowflake
  • Implement ETL (Extract, Transform, Load) processes using Snowflake's features such as Snowpipe, Streams, and Tasks (see the pipeline sketch after this list)
  • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs
  • Optimize data warehouse performance and scalability using Snowflake features like clustering, partitioning, and materialized views
  • Integrate Snowflake with external systems and data sources, including on-premises databases, cloud storage, and third-party APIs
  • Implement data synchronization processes to ensure consistency and accuracy of data across different systems
  • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features
  • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency
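
Below is a minimal sketch of an in-Snowflake pipeline using Snowpipe, a Stream, and a Task, illustrating the ingestion and transformation responsibilities above. Table, pipe, and warehouse names are hypothetical, and it assumes the external stage from the earlier sketch plus S3 event notifications (SQS) are already in place.

    -- Landing table plus a Snowpipe that auto-ingests new files from the external stage
    CREATE TABLE IF NOT EXISTS ANALYTICS_DB.RAW.ORDERS_RAW (
      order_id STRING, customer_id STRING, amount NUMBER(12,2), loaded_at TIMESTAMP_NTZ
    );

    CREATE PIPE IF NOT EXISTS ANALYTICS_DB.RAW.ORDERS_PIPE
      AUTO_INGEST = TRUE  -- relies on S3 event notifications delivered via SQS
    AS
      COPY INTO ANALYTICS_DB.RAW.ORDERS_RAW (order_id, customer_id, amount, loaded_at)
      FROM (SELECT $1, $2, $3, CURRENT_TIMESTAMP() FROM @ANALYTICS_DB.RAW.S3_RAW_STAGE);

    -- Stream to capture newly loaded rows, and a Task to move them downstream
    CREATE STREAM IF NOT EXISTS ANALYTICS_DB.RAW.ORDERS_STREAM
      ON TABLE ANALYTICS_DB.RAW.ORDERS_RAW;

    CREATE TABLE IF NOT EXISTS ANALYTICS_DB.RAW.ORDERS_CURATED LIKE ANALYTICS_DB.RAW.ORDERS_RAW;

    CREATE TASK IF NOT EXISTS ANALYTICS_DB.RAW.LOAD_ORDERS_TASK
      WAREHOUSE = TRANSFORM_WH  -- hypothetical warehouse
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ANALYTICS_DB.RAW.ORDERS_STREAM')
    AS
      INSERT INTO ANALYTICS_DB.RAW.ORDERS_CURATED
      SELECT order_id, customer_id, amount, loaded_at FROM ANALYTICS_DB.RAW.ORDERS_STREAM;

    ALTER TASK ANALYTICS_DB.RAW.LOAD_ORDERS_TASK RESUME;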

Preferred Qualifications

Experience working closely with Data Wrangling, ETL, Talend, Jasper, Java, Python, Unix, AWS, Data Warehousing, Data Modeling, Database Migration, ECreLT, the RBAC model, and Data Migration
