Summary
Join Pagos as a Data Engineer and play a key role in building and maintaining the platform that powers our products. You'll collaborate with engineers and analysts to build and own new features and system extensions, craft high-quality and scalable code, design and maintain data pipelines, build integrations with data providers, and drive engineering projects from start to finish. We're looking for an action-oriented problem solver who thrives in ambiguity and is committed to growth. This role requires significant experience in data engineering and related technologies.
Requirements
- 8+ years of software engineering experience with an emphasis on Data Engineering
- Bachelor's degree or higher in Computer Science or related technical discipline (or equivalent experience)
- Advanced experience with complex SQL queries and database/lakehouse technologies such as Redshift, Apache Iceberg, and Postgres
- Deep experience with big data technologies and frameworks such as Apache Spark and dbt, as well as data quality tooling like dbt tests
- Familiarity with cloud platforms like AWS, GCP, or Azure, and common data-related services (e.g. S3, Redshift, EMR, Glue, Kinesis, Athena)
- A bias for action, where no task is too small, and an eagerness to learn and grow with our industry
Responsibilities
- Craft high-quality code for scale, availability, and performance
- Design, develop, and maintain scalable data pipelines and processes to extract, process, and transform large volumes of data, both real-time and batched (ELT/ETL)
- Build and maintain integrations with data providers using various data transfer protocols
- Drive engineering projects from start to finish with a high level of ownership and autonomy
- Ensure the quality of our products and data through both manual and automated testing, as well as code reviews
Preferred Qualifications
- Experience with real-time streaming frameworks like Apache Kafka
- Experience with data quality tools such as Great Expectations and/or Soda
- Comfort and/or past experience working with and managing big data and ELT pipelines
- Comfort and/or past experience working with Temporal, Apache Airflow, or similar orchestration tools
- Experience working in high-growth, venture-backed startup(s)