Data Engineer

Dataroid

πŸ“Remote - Turkey

Summary

Join Dataroid, Turkey's fastest-growing data analytics platform, as a Data Engineer. You will design and build large-scale, resilient data pipelines using frameworks such as Apache Spark, Apache Flink, and Kafka. The role calls for experience in data engineering, data modeling, and ETL/ELT practices, along with proficiency in Python or Java. Dataroid offers a competitive compensation package including private health insurance, pension plans, meal vouchers, remote work benefits, and flexible working hours, as well as professional development opportunities and a thriving company culture.

Requirements

  • BSc/MSc/PhD in Computer Science or a related field, or equivalent work experience
  • 2+ years of experience in data engineering or a similar role
  • Experience with data modeling and ETL/ELT practices
  • Experience with one or more high-level Python- or Java-based batch and/or stream processing frameworks such as Apache Spark, Apache Flink, or Kafka Streams
  • Experience with relational and non-relational data stores, key-value stores, and search engines (PostgreSQL, ScyllaDB, Druid, ClickHouse, Redis, Hazelcast, Elasticsearch, etc.)
  • Familiarity with data workflow orchestration and transformation tools such as Airflow or dbt
  • Knowledge of storage formats such as Parquet, ORC, and/or Avro
  • Proficiency with Python or Java
  • Proficiency in code versioning tools such as Git
  • Strong analytical thinking and problem-solving skills
  • Strong verbal and written communication skills

Responsibilities

  • Design and build large-scale, resilient data pipelines using frameworks such as Apache Spark, Apache Flink, and Kafka
  • Write high-quality code that is well designed, reusable, testable, secure, and scalable
  • Collaborate with cross-functional teams
  • Discover, learn, and implement new technologies

Preferred Qualifications

  • Familiarity with distributed storage systems like HDFS and/or S3
  • Familiarity with data lake and data warehouse solutions, including Hive, Iceberg, and/or Delta Lake
  • Familiarity with distributed systems and concurrent programming
  • Familiarity with containerization and orchestration (Docker and/or Kubernetes)
  • Experience with, or willingness to learn, large-scale stream processing technologies
  • Familiarity with generative models and strong enthusiasm for generative AI, large language models (LLMs), and agentic systems
  • Prior experience with Scrum/Agile methodologies

Benefits

  • Private health insurance
  • Company-supported pension plans
  • Meal vouchers
  • Commute assistance
  • Remote work benefits
  • A paid day off for your birthday
  • Flexible working hours
  • Access to premier online learning platforms like Udemy, digital libraries, and tailored training programs
