Lead Data Engineer

Aimpoint Digital
Summary
Join Aimpoint Digital, a fully remote data and analytics consultancy, as a Lead Data Engineer. You will work with clients across a range of industries to design and develop end-to-end analytical solutions that improve their data-driven decision-making. Responsibilities include becoming a trusted advisor to clients, managing small teams to solve complex data engineering use cases, designing and developing analytical layers, and working with modern tools such as Snowflake and Databricks. You will write code in SQL, Python, and Spark and apply software engineering best practices. The role requires strong communication skills, extensive experience with relational databases and data pipelines, and expertise in data modeling and software engineering. The ideal candidate will also have experience with cloud data warehouses, ETL/ELT tools, and cloud platforms.
Requirements
- Degree educated in Computer Science, Engineering, Mathematics, or equivalent experience
- Experience managing stakeholders and collaborating with customers
- Strong written and verbal communication skills
- 5+ years working with relational databases and query languages
- 5+ years building data pipelines in production and the ability to work across structured, semi-structured, and unstructured data
- 5+ years data modeling (e.g. star schema, entity-relationship)
- 5+ years writing clean, maintainable, and robust code in Python, Scala, Java, or similar coding languages
- Ability to manage an individual workstream independently and to lead a small 1-2 person team
- Expertise in software engineering concepts and best practices
Responsibilities
- Become a trusted advisor working together with our clients, from data owners and analytic users to C-level executives
- Work independently or manage small teams to solve complex data engineering use-cases across a variety of industries
- Design and develop the analytical layer, building cloud data warehouses, data lakes, ETL/ELT pipelines, and orchestration jobs
- Work with modern tools such as Snowflake, Databricks, Fivetran, and dbt, and validate your skills with certifications
- Write code in SQL, Python, and Spark, and use software engineering best-practices such as Git and CI/CD
- Support the deployment of data science and ML projects into production
Preferred Qualifications
- DevOps experience
- Experience working with cloud data warehouses (Snowflake, Google BigQuery, AWS Redshift, Microsoft Synapse)
- Experience working with cloud ETL/ELT tools (Fivetran, dbt, Matillion, Informatica, Talend, etc.)
- Experience working with cloud platforms (AWS, Azure, GCP) and container technologies (Docker, Kubernetes)
- Experience working with Apache Spark
- Experience preparing data for analytics and following a data science workflow to drive business results
- Consulting experience
- Willingness to travel