Summary
Join Apply Digital, a high-growth digital experience agency, as a Senior Data Engineer. You will collaborate with cross-functional teams to design, build, and optimize data pipelines using Python and SQL. Leveraging cloud data platforms, you'll develop and maintain data warehouse schemas and analytics solutions. This role requires a Bachelor's degree, 5+ years of experience in data engineering, and strong expertise in SQL, ETL, and data warehousing technologies. The ideal candidate will possess excellent communication skills and a passion for technology. Apply Digital offers a hybrid/remote-friendly work environment, generous benefits, and ample learning opportunities.
Requirements
- A Bachelor's degree in Computer Science, Information Systems, or a related field
- At least 5 years of experience in data engineering or related fields
- Strong expertise in SQL, ETL, and data warehousing technologies
- Strong working knowledge of Python and PySpark
- Proven experience with at least one major cloud data platform (AWS Athena, Databricks, Snowflake, Azure Synapse/Fabric, or Google BigQuery)
- Knowledge of data migration tools and techniques
- Solid working knowledge of relational databases and query optimization
- Strong background in data warehouse design, ETL/ELT processes, and data modeling, ideally with modern cloud data warehouses and dbt
- Familiarity with cloud engineering concepts and best practices
- Experience working with large datasets and optimizing performance
- Excellent problem-solving skills and attention to detail
- Outstanding communication skills in English, both written and verbal
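As a brief illustration of the SQL and data-warehousing skills listed above, the sketch below builds a minimal star schema (one fact table, one dimension) in SQLite and runs a typical aggregate join; all table and column names are invented for this example.

```python
import sqlite3

# In-memory database standing in for a warehouse; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per product.
cur.execute("""
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        category   TEXT NOT NULL
    )
""")

# Fact table: one row per sale, keyed to the dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        amount     REAL NOT NULL
    )
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "books"), (2, "games")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(10, 1, 12.5), (11, 1, 7.5), (12, 2, 30.0)])

# Typical analytics query: revenue per category via a fact/dimension join.
cur.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY p.category
""")
revenue_by_category = dict(cur.fetchall())
print(revenue_by_category)  # {'books': 20.0, 'games': 30.0}
conn.close()
```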
Responsibilities
- Design, build, and optimize ETL/ELT pipelines using Python and SQL
- Develop and maintain data warehouse schemas and analytics solutions
- Implement data models and ensure data quality, consistency, and integrity
- Leverage cloud data platforms (e.g., AWS Athena, Databricks, Snowflake, Azure Synapse/Fabric, Google BigQuery) for storage, processing, and querying
- Create and maintain optimal, scalable data pipeline architecture
- Collaborate with analysts and stakeholders to understand data requirements and deliver solutions that meet business needs
- Optimize query performance and ensure efficient data workflows
- Document data pipelines, architectures, and processes
- Continuously improve and refine existing data infrastructure and solutions
- Develop and implement best practices for data management, security, and privacy
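A minimal sketch of the pipeline work described above, assuming invented record and table names: extract raw records, transform them (type coercion plus a simple data-quality rule), and load the result into SQLite.

```python
import sqlite3

def extract():
    # Stand-in for reading from an upstream source (API, files, etc.).
    return [
        {"id": "1", "email": "a@example.com"},
        {"id": "2", "email": None},          # fails the quality check below
        {"id": "3", "email": "c@example.com"},
    ]

def transform(rows):
    # Coerce types and enforce a basic data-quality rule: email is required.
    return [
        {"id": int(r["id"]), "email": r["email"]}
        for r in rows
        if r["email"]
    ]

def load(rows, conn):
    # Idempotent DDL plus a bulk insert with named parameters.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)"
    )
    conn.executemany("INSERT INTO users VALUES (:id, :email)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
loaded = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(loaded)  # 2 rows survive the quality check
```

In a production setting each stage would typically be a separate task in an orchestrator, but the extract/transform/load separation shown here is the same.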
Preferred Qualifications
- Knowledge of multiple cloud platforms (AWS, Azure, GCP) and their data services
- Familiarity with data governance and compliance requirements
- Knowledge of CI/CD practices for data engineering
- Familiarity with Agile development methodologies
- Experience with containerization (Docker, Kubernetes)
Benefits
- Generous training budgets, including partner tech certifications, custom learning plans, workshops, mentorship, and peer support
- Generous vacation policy
- Customizable benefits: Tailor your extended health and dental plan to your needs, priorities, and preferences
- Flexible work arrangements: We work in a variety of ways, from remote, to in-office, to a blend of both