Summary
Join phData, a leading innovator in the modern data stack, partnering with major cloud platforms. We're a remote-first global company committed to helping enterprises solve complex data challenges. We offer a casual, exciting work environment with autonomy for top performers. We're seeking a highly experienced Solutions Architect/Data Engineer with a proven track record of leading teams and delivering successful projects. The ideal candidate will possess strong technical skills, client-facing communication abilities, and a collaborative spirit. phData provides competitive compensation, excellent benefits, and professional development opportunities.
Requirements
- 6+ years of experience as a hands-on Solutions Architect and/or Data Engineer
- 2+ years of consulting experience managing projects for external customers
- Demonstrated expertise in leading and managing a team of Senior and Junior Data Engineers
- Ability to multitask, prioritize, and work across multiple projects at once
- Proven track record of collaborating with client stakeholders, technology partners, and cross-functional sales and delivery team members across distributed global teams
- Strong sense of ownership in resolving challenges
- Ability to develop end-to-end technical solutions and take them into production
- Programming expertise in Java, Python, and/or Scala
- Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP
- SQL proficiency (writing, debugging, and optimizing queries)
- Client-facing written and verbal communication skills and experience
- 4-year Bachelor's degree in Computer Science or a related field
Responsibilities
- Design and implement data solutions
- Manage projects for external customers
- Lead and manage a team of Senior and Junior Data Engineers
- Foster internal growth through coaching, mentoring, and performance management
- Multitask, prioritize, and work across multiple projects
- Collaborate with client stakeholders, technology partners, and cross-functional teams
- Ensure seamless, successful project delivery
- Resolve challenges and ensure exceptional project execution and delivery
- Develop end-to-end technical solutions and take them into production
- Ensure performance, security, scalability, and robust data integration
- Create and deliver detailed presentations
- Develop detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views)
Preferred Qualifications
- Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
- Experience with cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
- Experience with data integration technologies: Spark, Kafka, event streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service (DMS), Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or similar
- Experience with multiple data sources (e.g., queues, relational databases, files, search, APIs)
- Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment
- Experience with automated data transformation and data curation: dbt, Spark, Spark Streaming, and automated pipelines
- Experience with workflow management and orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi
- Experience with methodologies: Agile project management, data modeling (e.g., Data Vault, Kimball)
Benefits
- Remote-First Work Environment
- Casual, award-winning small-business work environment
- Collaborative culture that prizes autonomy, creativity, and transparency
- Competitive compensation, excellent benefits, a 4-week PTO plan plus 10 holidays (and other cool perks)
- Accelerated learning and professional development through advanced training and certifications