Summary
Join phData, a leading innovator in the modern data stack, partnering with major cloud data platforms. We are a remote-first global company committed to helping enterprises overcome their toughest data challenges. We offer a casual, exciting work environment where top performers have the autonomy to deliver results. We're looking for a hands-on Data Engineer with 8+ years of experience designing and implementing data solutions. The ideal candidate will have expertise in Java, Python, or Scala; core cloud data platforms; and SQL. phData is an award-winning workplace with a commitment to diversity and inclusion.
Requirements
- Demonstrate programming expertise in Java, Python, and/or Scala
- Show proficiency in core cloud data platforms including Snowflake, AWS, Azure, Databricks, and GCP
- Write, debug, and optimize SQL queries
- Possess client-facing communication skills and experience, both written and verbal
- Create and deliver detailed presentations
- Produce detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, logical system views)
- Hold a 4-year Bachelor's degree in Computer Science or a related field
Responsibilities
- Serve as a hands-on Data Engineer, designing and implementing data solutions
- Lead and/or mentor other engineers
- Deliver end-to-end technical solutions into production, ensuring performance, security, scalability, and robust data integration
Preferred Qualifications
- Possess production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
- Demonstrate experience with cloud and distributed data storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems
- Show expertise in data integration technologies: Spark, Kafka, event/streaming platforms, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or similar
- Have experience with multiple data sources (e.g., queues, relational databases, files, search, APIs)
- Possess complete software development lifecycle experience including design, documentation, implementation, testing, and deployment
- Demonstrate experience with automated data transformation and data curation: dbt, Spark, Spark Streaming, automated pipelines
- Show experience with workflow management and orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi
Benefits
- Remote-First Workplace
- Medical Insurance for Self & Family
- Medical Insurance for Parents
- Term Life & Personal Accident
- Wellness Allowance
- Broadband Reimbursement
- Continuous learning and growth opportunities to enhance your skills and expertise
- Paid certifications
- Professional development allowance
- Bonuses for creating company-approved content