Summary
Join phData, a leading innovator in the modern data stack that partners with major cloud platforms. We're a remote-first global company committed to helping enterprises solve complex data challenges, and we offer a casual, exciting work environment with opportunities for autonomy and growth. We're seeking a highly experienced Solutions Architect/Data Engineer with a strong technical background and leadership skills; the ideal candidate will have extensive experience in cloud data platforms, data integration, and software development. phData provides a comprehensive benefits package, including remote work, health insurance, and professional development opportunities.
Requirements
- 10+ years of experience as a hands-on Solutions Architect and/or Data Engineer
- Programming expertise in Java, Python, and/or Scala
- Experience with core cloud data platforms including Snowflake, Spark, AWS, Azure, Databricks, and GCP
- Proficiency in SQL, including writing, debugging, and optimizing SQL queries
- Client-facing written and verbal communication skills and experience
- 4-year Bachelor's degree in Computer Science or a related field
Responsibilities
- Design and implement data solutions
- Lead and mentor other engineers
- Develop and deploy end-to-end technical solutions to production, ensuring performance, security, scalability, and robust data integration
- Create and deliver detailed presentations
- Develop detailed solution documentation (e.g., POCs and roadmaps, sequence diagrams, class hierarchies, logical system views, etc.)
Preferred Qualifications
- Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
- Experience with cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
- Experience with data integration technologies: Spark, Kafka, event streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or other data integration technologies
- Experience with multiple data sources (e.g., queues, relational databases, files, search, APIs)
- Complete software development lifecycle experience, including design, documentation, implementation, testing, and deployment
- Experience with automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
- Experience with workflow management and orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi
Benefits
- Remote-First Workplace
- Medical Insurance for Self & Family
- Medical Insurance for Parents
- Term Life & Personal Accident
- Wellness Allowance
- Broadband Reimbursement
- Continuous learning and growth opportunities to enhance your skills and expertise
- Paid certifications
- Professional development allowance
- Bonuses for creating company-approved content