Forward Deployed Data Engineer

TRM Labs

📍 Remote - United States

Summary

Join TRM Labs, a blockchain intelligence company, as a Forward Deployed Data Engineer working directly with government customers. You will bridge the company's SaaS offering with federated capabilities and customer cloud architectures, owning everything from setup and implementation to configuration and maintenance. The role involves designing and deploying secure, scalable cloud-based solutions on AWS, building ETL/ELT pipelines, containerizing and deploying services on Kubernetes, and designing integrations with a variety of data sources. You will participate in all phases of the software development lifecycle and implement observability solutions. The position requires a Bachelor's degree, 4+ years of experience building data pipelines in Python, and expertise with Apache Airflow and Spark. You will also support mission-critical systems in production and work closely with customer operations teams. This is an opportunity to make a significant impact on global security and financial systems.

Requirements

  • Bachelor's degree (or equivalent) in Computer Science, Engineering, or a related field
  • 4+ years of hands-on experience building and deploying data pipelines in Python
  • Proven expertise with Apache Airflow (DAG development, scheduler tuning, custom operators; a minimal DAG sketch follows this list)
  • Strong knowledge of Apache Spark (Spark SQL, DataFrames, performance tuning)
  • Deep SQL skills, including writing and optimizing queries that use window functions and CTEs over large datasets (a PySpark sketch of these constructs follows this list)
  • Professional experience deploying cloud-native architectures on AWS, including services like S3, EMR, EKS, IAM, and Redshift
  • Familiarity with secure cloud environments and experience implementing FedRAMP/FISMA controls
  • Experience deploying applications and data workflows on Kubernetes, preferably EKS
  • Infrastructure-as-Code proficiency with Terraform or CloudFormation
  • Skilled in GitOps and CI/CD practices using Jenkins, GitLab CI, or similar tools
  • Excellent verbal and written communication skills—able to interface confidently with both technical and non-technical stakeholders
  • Willingness and ability to travel up to 25% to client sites as needed
  • Active TS/SCI clearance required (Polygraph strongly preferred)
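
As a rough illustration of the Airflow skills above, here is a minimal DAG sketch. It assumes Airflow 2.4+ (for the `schedule` argument), and the pipeline name ("trm_ingest") and the extract/load callables are hypothetical placeholders, not details from the posting.

```python
# Minimal Airflow DAG sketch; task names and logic are illustrative only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull a batch from an upstream source (API, bucket, queue).
    print("extracting batch for", context["ds"])


def load(**context):
    # Placeholder: write the transformed batch to the lakehouse (e.g., S3/Iceberg).
    print("loading batch for", context["ds"])


with DAG(
    dag_id="trm_ingest",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```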
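
Likewise, a small PySpark sketch of the Spark SQL and window-function/CTE skills listed above. It assumes pyspark is installed; the "transfers" table and its columns are invented for illustration.

```python
# PySpark sketch: a CTE plus a window function over a toy dataset.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-sketch").getOrCreate()

# Toy DataFrame standing in for a large transfers table.
df = spark.createDataFrame(
    [("a", "2024-01-01", 10.0), ("a", "2024-01-02", 5.0), ("b", "2024-01-01", 7.0)],
    ["wallet", "day", "amount"],
)
df.createOrReplaceTempView("transfers")

# Running total of transfer volume per wallet, ordered by day.
running = spark.sql("""
    WITH daily AS (
        SELECT wallet, day, SUM(amount) AS total
        FROM transfers
        GROUP BY wallet, day
    )
    SELECT wallet, day, total,
           SUM(total) OVER (PARTITION BY wallet ORDER BY day) AS running_total
    FROM daily
""")
running.show()
```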

Responsibilities

  • Partner directly with mission-focused customers to design and deploy secure, scalable cloud-based data lakehouse solutions on AWS (e.g., S3, EMR/EKS, Iceberg or Delta Lake)
  • Own and deliver production-ready ETL/ELT pipelines using Python, Apache Airflow, Spark, and SQL—optimized for petabyte-scale workloads
  • Containerize and deploy services on Kubernetes (EKS), using Terraform or CloudFormation for Infrastructure-as-Code and repeatable environments
  • Design integrations that ingest data from message buses, APIs, and relational databases, embedding real-time analytics capabilities into client workflows (see the ingestion sketch after this list)
  • Actively participate in all phases of the software development lifecycle: requirements gathering, architecture, implementation, testing, and secure deployment
  • Implement observability solutions (e.g., Prometheus, Datadog, New Relic) to uphold SLAs and drive continuous improvement (see the metrics sketch after this list)
  • Support mission-critical systems in production environments—resolving incidents alongside customer operations teams
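
For the message-bus ingestion responsibility, here is a hedged sketch using kafka-python; the posting does not name a specific bus or client, so the library choice, topic, broker address, and consumer group are all assumptions.

```python
# Hypothetical Kafka ingestion loop; topic and broker are placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "onchain-events",                    # hypothetical topic
    bootstrap_servers="localhost:9092",  # placeholder broker address
    group_id="trm-ingest",               # hypothetical consumer group
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Placeholder: validate, enrich, and hand the event to the pipeline.
    print(message.topic, message.offset, event)
```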
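
And for the observability responsibility, a minimal sketch with prometheus_client, matching one of the tools the posting names; the metric names and scrape port are illustrative assumptions.

```python
# Expose pipeline metrics for Prometheus to scrape; names are illustrative.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

RECORDS = Counter("pipeline_records_total", "Records processed")
LATENCY = Histogram("pipeline_batch_seconds", "Batch processing time")


def process_batch():
    with LATENCY.time():  # records batch duration in the histogram
        time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
        RECORDS.inc(100)  # stand-in for records handled


if __name__ == "__main__":
    start_http_server(8000)  # serve /metrics on port 8000
    while True:
        process_batch()
```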

Benefits

  • The estimated base salary range for this role is $200,000 - $275,000
  • Additionally, this role may be eligible to participate in TRM’s equity plan
  • Work alongside top experts and learn every day
  • Embrace a growth mindset with development opportunities tailored to your role
  • Take on high-impact challenges in a fast-paced, collaborative environment
  • As a remote-first company, TRM Labs is built for global collaboration
