Data Scientist

Wormhole Foundation
Summary
Join Wormhole Foundation as a Data Scientist and work across product, engineering, and business teams to extract insights from complex datasets. You will analyze on-chain and off-chain data, create statistical tooling to monitor network parameters, identify data-based opportunities to improve Wormhole's products, and collaborate on data pipelines. Your work will directly influence decision-making and product development. The ideal candidate has 4+ years of experience in data science, expertise with blockchain data, proficiency in programming languages (Python, Java, Scala), and experience with big data warehouse systems. The role requires experience with SQL and NoSQL, workflow orchestration, source control, data ingestion, and cloud service database management. Experience with BI and visualization tools, machine learning, and open-source big data technologies is a plus.
Requirements
- 4+ years of industrial experience in data science
- Demonstrated expertise in working with, procuring, and tabulating on-chain data across a range of blockchain environments (with a particular focus on EVM and SVM)
- Proficient in one or more programming languages (Python, Java, Scala)
- Expertise with big data warehouse systems (Snowflake, BigQuery, Databricks, etc.)
- Experience in SQL and NoSQL
- Experience with workflow orchestration and source control management
- Experience with data ingestion following ELT best practices
- Experience in building infrastructure to support batch, micro-batch or stream data processing at scale
- Experience with cloud service database management (GCP, Azure, etc.)
Responsibilities
- Support strategic initiatives by analyzing on-chain and off-chain data
- Create statistical tooling to monitor and critically assess network parameters on an ongoing basis, including experimentation frameworks
- Identify and execute on novel data-based opportunities to improve Wormhole's core product offerings
- Collaborate across the Wormhole ecosystem and external teams to create and maintain robust data pipelines
- Other special projects as assigned
Preferred Qualifications
- Experience with BI and visualization tools (Tableau, Looker, etc.) is a plus
- Expertise in machine learning and statistics, with a keen interest in experimentation
- Knowledge of the internals of open-source or related big data technologies
- Motivated, creative self-starter excited to work in a meritocratic work environment
- Enjoy working with a remote-first team, and are based in a US East, UK, or European time zone