Senior Data Engineer

Workato

πŸ“Remote - Bulgaria

Summary

Join Workato's data engineering team as a Senior Data Engineer and help maintain two crucial data circuits: one for internal analytics and another for in-product analytics. Your primary responsibility will be developing a new, near real-time usage tracking/billing platform integrated with various internal systems. This platform will support advanced use cases such as usage forecasting and anomaly detection. You will collaborate closely with a small, flexible team, using modern tools like Snowflake, ClickHouse, and Airflow. Workato offers a vibrant and dynamic work environment with numerous benefits. The role requires extensive experience in data engineering, proficiency in SQL and several programming languages, and strong communication skills.

Requirements

  • 5+ years of work experience building and maintaining data pipelines in data-heavy environments (Data Engineering, or Backend with an emphasis on data processing)
  • Fluent knowledge of SQL
  • Strong knowledge of common analytical-domain programming languages such as Java or Scala, and basic knowledge of Python
  • Strong experience with Flink and Spark
  • Experience with data pipeline orchestration tools (Airflow, Dagster, or similar)
  • Experience with data warehousing solutions (Snowflake, Redshift, BigQuery)
  • Confidence in using Git, K8s and Terraform
  • Good understanding of Data Privacy and Security (GDPR, CCPA)
  • Good communication and collaboration skills
  • Readiness to work remotely with teams distributed across the world and across timezones
  • Spoken English (at a level sufficient to pass technical interviews and later work with colleagues)

Responsibilities

  • Maintain two data circuits: one for internal analytics and another for in-product analytics
  • Develop a new usage tracking/billing platform providing accurate near real-time data for both circuits
  • Integrate the new platform with the back office, internal data warehouse, and in-product analytical and reporting tool (Workato Insights)
  • Address advanced use cases such as usage forecasting, anomaly detection, and real-time alerts
  • Collaborate closely with the ML team
  • Work as part of a small, flexible team of 4 engineers
  • Have a direct impact on modernizing and maturing the platform, including architecture decisions
  • Use modern tools such as an Iceberg data lake, Snowflake, ClickHouse, Airflow, Flink, Spark, Trino, and the Kafka ecosystem on a daily basis

Preferred Qualifications

  • Production experience with PostgreSQL and ClickHouse
  • Experience building billing/usage tracking platforms
  • Experience with solution cost optimization and capacity planning
  • Exposure to, or interest in, working with data pipeline technologies

Benefits

  • Vibrant and dynamic work environment
  • A multitude of benefits to enjoy inside and outside of work
