
Data Engineer

phData
Summary
Join phData, a leading provider of data solutions, and contribute to its mission of helping global enterprises overcome data challenges. As a remote-first company with a global presence, phData offers a casual and collaborative work environment that values autonomy, creativity, and transparency. This role requires 1-4 years of experience as a Software Engineer, Data Engineer, or Data Analyst, with expertise in Java, Python, or Scala, core cloud data platforms, and SQL. You will be responsible for developing end-to-end technical solutions and ensuring performance, security, scalability, and robust data integration. phData provides competitive compensation, excellent benefits, generous PTO, and opportunities for professional development through advanced training and certifications.
Requirements
- Programming expertise in Java, Python, and/or Scala
- Experience with core cloud data platforms, including Snowflake, AWS, Azure, Databricks, and GCP
- SQL proficiency, including the ability to write, debug, and optimize queries (see the sketch after this list)
- Client-facing written and verbal communication skills and experience
- 4-year Bachelor's degree in Computer Science or a related field
- Strong English communication skills (written and verbal)
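
The SQL requirement above is about query craft as much as syntax. As a loose illustration (not part of the posting), here is a minimal, self-contained Python sketch using the standard-library sqlite3 module with a hypothetical customers/orders schema; it shows one common optimization, replacing a per-row correlated subquery with a pre-aggregated join.

```python
# Minimal sketch with a made-up schema (not from the posting), showing one
# common SQL optimization: a correlated subquery rewritten as a single
# aggregated join so the orders table is scanned once, not once per customer.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 40.0);
""")

# Before: the correlated subquery re-runs for every customer row.
slow = """
    SELECT c.name,
           (SELECT SUM(o.total) FROM orders o WHERE o.customer_id = c.id) AS spend
    FROM customers c
    ORDER BY c.name;
"""

# After: aggregate once, then join. (Results match here because every
# customer has orders; a LEFT JOIN would preserve the original semantics
# for customers with none.)
fast = """
    SELECT c.name, t.spend
    FROM customers c
    JOIN (SELECT customer_id, SUM(total) AS spend
          FROM orders GROUP BY customer_id) t
      ON t.customer_id = c.id
    ORDER BY c.name;
"""

assert conn.execute(slow).fetchall() == conn.execute(fast).fetchall()
print(conn.execute(fast).fetchall())  # [('Acme', 200.0), ('Globex', 40.0)]
```

The same rewrite applies on platforms like Snowflake or Databricks, where scanning a large fact table once instead of once per row is typically the difference that matters.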
Responsibilities
- Develop end-to-end technical solutions into production, and help ensure performance, security, scalability, and robust data integration
- Create and deliver detailed presentations
- Create detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views)
Preferred Qualifications
- Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks
- Cloud and Distributed Data Storage: S3, ADLS, HDFS, GCS, Kudu, ElasticSearch/Solr, Cassandra or other NoSQL storage systems
- Data integration technologies: Spark, Kafka, event/streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or similar (see the streaming sketch after this list)
- Multiple data sources (e.g. queues, relational databases, files, search, API)
- Complete software development lifecycle experience including design, documentation, implementation, testing, and deployment
- Automated data transformation and data curation: dbt, Spark, Spark streaming, automated pipelines
- Workflow Management and Orchestration: Airflow, AWS Managed Airflow, Luigi, NiFi
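
As a rough sketch of the Spark/Kafka integration pattern referenced above (a generic example, not phData's actual stack): a PySpark Structured Streaming job that reads JSON events from a Kafka topic and lands them as Parquet. The broker address, topic name, schema, and paths are all placeholders, and the Spark-Kafka connector package must be on the classpath.

```python
# Minimal sketch, assuming a local Kafka broker and a hypothetical "events"
# topic: read a stream from Kafka, parse the JSON payload, write Parquet.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = (SparkSession.builder
         .appName("kafka-to-parquet-sketch")
         .getOrCreate())

# Hypothetical event schema; a real job would derive this from a data contract.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
       .option("subscribe", "events")                        # placeholder topic
       .load())

# Kafka delivers value as bytes; cast to string and unpack the JSON fields.
events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/tmp/events")                # placeholder sink
         .option("checkpointLocation", "/tmp/events_ckpt")
         .outputMode("append")
         .start())

query.awaitTermination()
```

The checkpoint location is what makes the query restartable with exactly-once file output; in a real deployment it would live on durable storage such as S3, ADLS, or GCS rather than /tmp.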
Benefits
- Remote-First Work Environment
- Casual, award-winning small-business work environment
- Collaborative culture that prizes autonomy, creativity, and transparency
- Competitive comp, excellent benefits, generous PTO plan plus 10 holidays (and other cool perks)
- Accelerated learning and professional development through advanced training and certifications