Senior Analytics Engineer


Blockchain Education Network

📍 Remote - United States

Summary

Join the Uniswap Foundation as a Senior Analytics Engineer and own the data transformation layer, converting raw data into reliable, analytics-ready models. You will leverage existing tools such as Dune's public dbt Spellbook and build custom pipelines on BigQuery/Snowflake. Collaborating with Data Analysts and Growth Managers, you will generate insights for grants and liquidity-mining programs while ensuring data quality and accessibility. The role spans building and optimizing data models, developing and maintaining data pipelines with monitoring and alerting, partnering with cross-functional teams to refine schemas and dashboards, centralizing data sources, and planning and building in-house models. You will also champion best practices and stay current with emerging data engineering tools and cloud services.

Requirements

  • Engineering-minded: you treat analytics transformations as production code that is robust, testable, and maintainable
  • Future-focused: adept with Dune Spellbook today and excited to build self-hosted solutions tomorrow
  • Detail-obsessed: you identify edge cases, troubleshoot upstream issues, and proactively prevent data drift
  • Collaborative: you translate requirements into solutions and work seamlessly across small, cross-functional teams

Responsibilities

  • Build & optimize data models (dbt or equivalent) for Uniswap, hook protocols and broader DEX metrics, ensuring accuracy, consistency and performance
  • Develop & maintain pipelines to Ingest onchain events, API feeds and third-party sources into Dune/BigQuery/Snowflake, with monitoring and alerting
  • Optimize pipeline health : Implement monitoring, alerting and root-cause workflows to quickly detect and resolve data issues
  • Collaborate & iterate : Partner with Data Analysts, Growth and Research teams to refine schema, metrics and dashboards making data intuitive to query and interpret
  • Centralize data sources : Merge disparate feeds into a unified repository while provisioning data to where it’s needed
  • Plan & build in-house models : As needed, gradually transition transformations into BigQuery or Snowflake design schemas, materializations and deployment workflows
  • Champion best practices : Contribute to open standards in the Uniswap and DEX communities
  • Stay current: Evaluate emerging data‑engineering tools and cloud services (BigQuery, Snowflake, AWS/GCP) and recommend enhancements to our stack
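
To give a concrete flavor of the pipeline-health work described above, here is a minimal, illustrative sketch of a table-freshness check against BigQuery. The project, dataset, table, and column names (`my-project.dex.swap_events`, `block_timestamp`) and the six-hour threshold are hypothetical assumptions for illustration, not part of the Uniswap Foundation's actual stack.

```python
# Minimal sketch of a pipeline freshness check, assuming a BigQuery table
# with a `block_timestamp` column. All project/dataset/table names and the
# freshness threshold below are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from google.cloud import bigquery  # requires google-cloud-bigquery

FRESHNESS_THRESHOLD = timedelta(hours=6)  # assumed SLA for onchain ingestion


def table_is_fresh(client: bigquery.Client, table: str) -> bool:
    """Return True if the newest row in `table` is within the freshness threshold."""
    query = f"SELECT MAX(block_timestamp) AS latest FROM `{table}`"
    row = next(iter(client.query(query).result()))
    latest = row["latest"]
    if latest is None:  # empty table: treat as stale
        return False
    lag = datetime.now(timezone.utc) - latest
    return lag <= FRESHNESS_THRESHOLD


if __name__ == "__main__":
    client = bigquery.Client()  # uses default GCP credentials
    if not table_is_fresh(client, "my-project.dex.swap_events"):
        # In practice this would route to an alerting channel (e.g., Slack, PagerDuty).
        print("ALERT: swap_events table is stale")
```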

Preferred Qualifications

  • Proficiency with modern cloud platforms (e.g., BigQuery, Snowflake, AWS, GCP, or Azure) and experience with both OLTP and analytical databases such as PostgreSQL or ClickHouse
  • Experience building subgraphs or equivalent custom indexers (e.g., The Graph, Ponder)
  • Experience building and exposing internal/external Data APIs and deploying containerized workloads using Docker and Kubernetes
  • Advanced degree in Computer Science, Data Engineering, or a related technical field


