Data Engineer

Transfr
Summary
Join Transfr as a Data Engineer and build data solutions from varied sources that enable data-driven decisions on usability, learning, product engagement, and quality. You will collaborate with researchers, engineers, and product designers, contributing to learning research and analytics, and will embody Transfr's seven cultural values. The role involves building data infrastructure on GCP, creating real-time data solutions for A/B testing, and maintaining data pipeline architecture. You will partner with internal teams to resolve data-related issues and support their data infrastructure needs, adhere to data privacy and security guidelines, and build data expertise, owning data quality for assigned areas. You will also create efficient processes for data acquisition, extraction, integration, transformation, and modeling.
Requirements
- 4+ years of experience as a data engineer, analytics engineer, data analyst, or similar role focused on data architecture and warehouse development
- Proven expertise in SQL, Python, and cloud platform technologies (we use GCP)
- Deep understanding of data warehousing, modeling, and real-time analytics to support experimentation (e.g., A/B testing) and product intelligence
- Proficient in modern data tooling (e.g., BigQuery, dbt, Airflow, Pub/Sub) and version control systems (e.g., Git)
- Experience building robust, scalable ETL/ELT pipelines and streaming data solutions
- Knowledge of data visualization and BI tools such as Looker, Tableau, or similar
- Familiarity with data privacy, governance, and security best practices, including compliance frameworks (e.g., GDPR, FERPA)
- Strong communication skills with the ability to translate complex data concepts into clear, actionable insights for technical and non-technical stakeholders
- Demonstrated success working cross-functionally with product managers, researchers, engineers, and designers
- Experience contributing to data documentation and mentoring teammates on best practices and data strategy
- Self-starter who takes radical responsibility for end-to-end data solution delivery
- Proven ability to work in high-collaboration, team-oriented environments and thrive on shared success
- A creative problem solver who embraces ambiguity and seeks novel approaches to challenging data problems
- Strong empathy for data users and consumers, with a focus on usability and clarity in data access and reporting
- Demonstrated humility and openness to feedback, continuous improvement, and collaborative experimentation
Responsibilities
- Work with stakeholders to map business definitions to logical definitions that can be modeled in dbt
- Instrument data solutions from multiple sources that enable fast, data-driven decision making on issues including usability, learning, product engagement, and quality
- Build the infrastructure required for optimal extraction, transformation, and loading of data from multiple data sources using GCP technologies
- Build real-time, scalable data solutions that support A/B product testing, and enable data and learning science by providing visualizations and views into complex data sets
- Create and maintain data pipeline architecture, configuration, and implementation
- Work with internal teams to assist with data-related technical issues and support their data infrastructure, access, and visualization needs
- Support cross-functional teams with the creation of data strategies and data guidance documentation and architectures
- Adhere to data privacy and security guidelines and regulations
- Build data expertise and own data quality for assigned areas
- Create efficient processes for acquiring, extracting, integrating, transforming, and modeling data to derive useful information
Benefits
- Stock options
- 401(k)
- Paid vacation and sick time
- Health benefits