Analytical Data Engineer

The Motley Fool
Summary
Join The Motley Fool as a freelance Analytical Data Engineer to prototype and build a new data product from internal exhaust data. This is a 40-hour-per-week contract role for at least 3 months, ideal for a mid-to-senior engineer with 4+ years of experience. Working closely with the data and product teams, you will design, build, and document scalable data pipelines; query and debug large datasets in Snowflake; develop Airflow workflows; ensure data quality, privacy, and compliance; and package and deliver datasets for future commercialization. The position requires strong experience with data pipelines, Snowflake, Airflow, SQL, Python, and AWS services.
Requirements
- Strong experience building data pipelines, ideally with Snowflake and Airflow
- Experience with Snowflake Cortex and LLMs
- Proficiency in SQL, including multi-table joins, CTEs, and window functions
- Experience developing in Python, particularly for REST API ingestion and data manipulation
- Familiarity with AWS services including Lambda, S3, and IAM
- Experience building and orchestrating pipelines in Airflow, including DAG design and task management
- Experience with Snowflake external stages, data shares, RBAC, and Snowpark-based task automation
- Ability to work independently, communicate clearly, and deliver results with minimal supervision
- Experience working with complex and time series datasets
- Experience designing and building data products for external consumption (APIs, public data shares, customer-facing datasets)
- Experience documenting and packaging datasets for external use (schemas, dictionaries, metadata, user guides)
- Familiarity with data monetization strategies
- Experience working with event tracking data, such as GA4
Responsibilities
- Build focused data pipelines and processing workflows to support the new data product
- Query and analyze large datasets in Snowflake using SQL
- Debug and resolve data pipeline issues throughout the prototyping lifecycle
- Develop workflows using Airflow (including custom operators) for scalable, modular data orchestration
- Design and prototype data products using internal exhaust/event data, working closely with stakeholders to identify monetizable opportunities
- Package and deliver datasets via Snowflake shares, APIs, or other external-facing interfaces
- Ensure data quality, governance, and privacy requirements are met for external use
- Implement data lineage and usage tracking to inform future improvements
- Document architecture, schema definitions, and user-facing metadata
- Collaborate across product, data, and business to validate product direction and dataset usability
- Stay current on tech trends and best practices in data product development and externalization
Preferred Qualifications
- Experience working with financial data, especially intraday time series
- Personal or professional experience using The Motley Fool's services
- Familiarity with data quality tools (e.g., Great Expectations, Monte Carlo, Soda)
- Experience with DevOps/IaC tools like Terraform or CloudFormation
- Experience designing with data contracts and managing interface versioning
- Experience working with cross-functional teams, including data governance, product, legal, or compliance, to prepare data for external release
Benefits
$85–$100 USD