Solutions Architect

phData

πŸ“Remote - United States

Summary

Join phData, a leading innovator in the modern data stack partnering with the major cloud data platforms. We are a remote-first global company committed to helping enterprises overcome their toughest data challenges, and we offer a casual, exciting work environment with real autonomy for top performers. Our workplace culture has earned multiple awards and recognitions. This Solutions Architect/Data Engineer role requires extensive experience designing and implementing data solutions across a range of technologies and platforms. The ideal candidate combines strong technical skills, leadership ability, and excellent communication. We offer competitive compensation, excellent benefits, and professional development opportunities.

Requirements

  • Possess 8+ years of experience as a hands-on Solutions Architect and/or Data Engineer
  • Hold a 4-year Bachelor's degree in Computer Science or a related field

Responsibilities

  • Serve as a hands-on Solutions Architect and/or Data Engineer, designing and implementing data solutions
  • Lead and mentor other engineers
  • Develop end-to-end technical solutions and take them into production, ensuring performance, security, scalability, and robust data integration
  • Demonstrate programming expertise in Java, Python, and/or Scala
  • Utilize core cloud data platforms including Snowflake, AWS, Azure, Databricks, and GCP
  • Write, debug, and optimize SQL queries
  • Exhibit strong client-facing written and verbal communication skills
  • Create and deliver detailed presentations
  • Produce detailed solution documentation (e.g., POCs and roadmaps, sequence diagrams, class hierarchies, logical system views, etc.)

Preferred Qualifications

  • Have production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, and Databricks
  • Have experience with cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
  • Have experience with data integration technologies: Spark, Kafka, event/streaming tools, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies
  • Have experience with multiple data sources (e.g., queues, relational databases, files, search, APIs)
  • Have complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment
  • Have experience with automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
  • Have experience with workflow management and orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi

Benefits

  • Remote-First Work Environment
  • Casual, award-winning small-business work environment
  • Collaborative culture that prizes autonomy, creativity, and transparency
  • Competitive comp, excellent benefits, 4 weeks PTO plus 10 holidays (and other cool perks)
  • Accelerated learning and professional development through advanced training and certifications
