Lead/Senior Data Engineer

phData

πŸ“Remote - United States

Summary

Join phData, a leading innovator in the modern data stack, partnering with major cloud platforms. We're a remote-first global company fostering a collaborative and supportive work environment. We're seeking a highly skilled Data Engineer with 4+ years of experience in designing and implementing data solutions. The ideal candidate will possess strong programming skills, experience with cloud data platforms, and excellent communication abilities. We offer competitive compensation, excellent benefits, and opportunities for professional development. phData is an award-winning workplace committed to diversity and inclusion.

Requirements

  • 4+ years of experience as a hands-on Data Engineer and/or Software Engineer designing and implementing data solutions
  • Programming expertise in Java, Python, and/or Scala, with experience across the software development life cycle, including unit and integration testing
  • Experience with core cloud data platforms including Snowflake, AWS, Azure, Databricks, and GCP
  • Experience using SQL and the ability to write, debug, and optimize SQL queries
  • Client-facing written and verbal communication skills and experience
  • 4-year Bachelor's degree in Computer Science or a related field

Responsibilities

  • Deliver end-to-end technical solutions into production, ensuring performance, security, scalability, and robust data integration
  • Multitask, prioritize, and work across multiple projects simultaneously
  • Create and deliver detailed presentations
  • Develop detailed solution documentation (e.g., POCs and roadmaps, sequence diagrams, class hierarchies, logical system views, etc.)

Preferred Qualifications

  • Production experience in core data platforms: Snowflake (including Snowflake Native Apps), AWS, Azure, GCP, Hadoop, Databricks, IICS
  • Experience with cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
  • Experience with data integration technologies: Spark, Kafka, event/streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies
  • Experience with multiple data sources (e.g., queues, relational databases, files, search, APIs)
  • Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment
  • Experience with automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
  • Experience with workflow management and orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi

Benefits

  • Remote-First Work Environment
  • Casual, award-winning small-business work environment
  • Collaborative culture that prizes autonomy, creativity, and transparency
  • Competitive compensation, excellent benefits, 4 weeks PTO plus 10 holidays (and other cool perks)
  • Accelerated learning and professional development through advanced training and certifications
