Senior Snowflake Data Engineer

Deutsche Telekom IT Solutions

πŸ“Remote - Hungary

Summary

Join Deutsche Telekom IT Solutions, Hungary’s most attractive employer in 2025, as a skilled Snowflake Data Engineer. Develop, manage, and optimize Snowflake data models and pipelines across multiple environments. Implement zero-copy cloning, data masking, and row access policies. Collaborate on CI/CD pipelines using Terraform and dbt. Enable data provisioning to Power BI. Troubleshoot data issues, optimize queries, and manage deployments. Integrate data from various sources, including SAP PTB and Azure Blob Storage. Ensure data validation and compliance with governance and security policies. Support junior developers and analysts. Remote work is possible within Hungary.

Requirements

  • 3–5 years of hands-on experience with Snowflake, including role-based access control, zero-copy cloning, and multi-environment structures
  • Proficient in SQL for Snowflake, dbt, and data modeling (star/snowflake schema, views, materialized views)
  • Familiarity with CI/CD pipelines using Terraform, dbt, and version control (Git)
  • Experience working with semi-structured data (JSON, Parquet) and optimizing for fast query performance (see the query sketch after this list)
  • Experience integrating with Power BI (Import mode and DirectQuery)
  • Knowledge of data masking and privacy-compliant data workflows
  • Experience using Azure Data Factory or similar ETL orchestration tools
  • Familiarity with enterprise data governance and secure data sharing across environments
  • Strong troubleshooting skills and experience with production support operations
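
As a rough illustration of the semi-structured data work listed above, the following Snowflake SQL sketch stores JSON in a VARIANT column and flattens a nested array; the table and field names (raw_events, payload, items) are hypothetical and not taken from the role description.

    -- Hypothetical sketch: querying JSON stored in a VARIANT column
    CREATE OR REPLACE TABLE raw_events (payload VARIANT);

    -- Flatten a nested array and cast fields for downstream modeling
    SELECT
        e.payload:event_id::STRING          AS event_id,
        e.payload:created_at::TIMESTAMP_NTZ AS created_at,
        i.value:sku::STRING                 AS sku,
        i.value:quantity::NUMBER            AS quantity
    FROM raw_events AS e,
         LATERAL FLATTEN(input => e.payload:items) AS i;

Casting VARIANT paths to typed columns in a view or dbt model is a common way to keep downstream queries and Power BI reports fast.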

Responsibilities

  • Develop, manage, and optimize Snowflake data models and pipelines across six structured environments (DEV, UAT, DATA_DEV, PREPROD, PROD, DATA_RESTRICTION)
  • Implement zero-copy cloning, data masking, and row access policies for privacy-compliant development workflows (see the sketch after this list)
  • Collaborate on CI/CD pipelines using Terraform and dbt to deploy and validate infrastructure and data transformations
  • Enable data provisioning to Power BI via Import and DirectQuery modes, optimizing for performance and consistency
  • Perform end-to-end support: troubleshoot data issues, optimize queries, and manage deployments in UAT/PREPROD/PROD
  • Integrate data from SAP PTB, Xtract Universal, Azure Blob Storage, and external APIs via data orchestration tools like Azure Data Factory
  • Ensure smooth data validation workflows in PREPROD and monitor pipelines across the environments
  • Work with platform architects to implement state-of-the-art architectural principles (cost efficiency, scalability, modularity)
  • Ensure compliance with governance and security policies, including role-based access controls in Snowflake
  • Support onboarding and enablement of junior developers and analysts working with Snowflake or consuming tools
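
To make the privacy-compliant workflow above more concrete, here is a minimal Snowflake SQL sketch combining a zero-copy clone with a masking policy and a row access policy; all database, table, role, and policy names are assumptions for illustration only, not the team's actual objects.

    -- Hypothetical sketch: clone production, then mask and restrict the clone
    CREATE DATABASE analytics_dev CLONE analytics_prod;  -- zero-copy clone

    -- Masking policy: only a privileged role sees raw e-mail addresses
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
        CASE WHEN CURRENT_ROLE() IN ('DATA_ADMIN') THEN val ELSE '***MASKED***' END;
    ALTER TABLE analytics_dev.public.customers
        MODIFY COLUMN email SET MASKING POLICY email_mask;

    -- Row access policy: expose rows only for regions mapped to the current role
    CREATE OR REPLACE ROW ACCESS POLICY region_policy AS (region STRING) RETURNS BOOLEAN ->
        CURRENT_ROLE() = 'DATA_ADMIN'
        OR EXISTS (SELECT 1 FROM governance.security.region_map m
                   WHERE m.role_name = CURRENT_ROLE() AND m.region = region);
    ALTER TABLE analytics_dev.public.sales
        ADD ROW ACCESS POLICY region_policy ON (region);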

Preferred Qualifications

  • Experience with SAP data extraction using tools like Xtract Universal
  • Understanding of Azure B2C, Tardis, and RESTful API integration
  • Familiarity with vendor lock-in mitigation and cloud cost optimization strategies
  • Knowledge of data provisioning for benchmarking and interoperability with partners
  • Experience working in a regulated enterprise environment (e.g. telecom, finance, government)

Benefits

  • Please note that remote work is available only within Hungary due to European taxation regulations


