Senior Data Engineer

Remedy Product Studios
Summary
Join Remedy Product Studios, a rapidly scaling technology company, as a Senior Data Engineer. You will design, implement, and maintain scalable data pipelines and storage solutions, working closely with various teams to ensure efficient data workflows. The ideal candidate possesses a strong data engineering background with AWS expertise and experience with big data technologies. Responsibilities include developing ETL/ELT pipelines, optimizing data storage, ensuring data integrity and compliance, and collaborating on data requirements. The role requires 5+ years of experience in data engineering, strong AWS skills, and proficiency in SQL and Python. Excellent communication and problem-solving skills are essential. Remedy offers a competitive compensation package including paid time off, medical insurance, and professional development opportunities.
Requirements
- 5+ years of experience in data engineering or a related field
- Strong expertise in AWS data services (e.g., Redshift, Glue, Athena, S3, Lambda, Kinesis)
- Experience with big data technologies such as Apache Spark, Apache Airflow, or similar tools
- Strong knowledge of SQL, data modeling, and schema design
- Strong hands-on experience in Python
- Strong problem-solving skills and ability to work in a fast-paced environment
- Excellent communication skills and ability to collaborate with technical and non-technical stakeholders
- Fluent English (B2 or higher), with strong written and spoken skills for clear, active communication with native speakers
Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines and data workflows
- Optimize data storage, processing, and retrieval using cloud-based technologies, with a strong focus on AWS
- Work with structured and unstructured data across various databases and data lakes
- Ensure data integrity, security, and compliance with industry standards
- Collaborate with cross-functional teams to define data requirements and support analytics and machine learning initiatives
- Monitor and improve data infrastructure performance, ensuring reliability and scalability
- Implement best practices for data governance, quality, and observability
- Develop and maintain robust data ingestion processes to ensure seamless data flow across systems
- Design and optimize data models to support analytical and operational use cases
Preferred Qualifications
- Experience with real-time data streaming and event-driven architectures
- Exposure to machine learning workflows and MLOps practices
- Knowledge of security and compliance standards in data engineering
- Experience with GCP or Azure is a plus (BigQuery, Databricks, Azure Data Factory, etc.)
Benefits
- Vacation leave of 20 working days
- Paid medical insurance
- 4 sick days per year
- Sports compensation
- Career, professional, and personal growth
- Access to senior engineers for help and mentorship
- English classes with on-staff teachers
- English clubs where you can polish your skills together with teammates
- Paid courses and certifications to develop your professional skills
- Option to work remotely or from the Warsaw office
- Company-provided MacBook as your work machine
- Collaborative, team-oriented work environment
- Team-building events at the team and group level
- Sponsored company outings and social events