Principal Analytics Engineer
Bluecore
Summary
Join Bluecore as a Principal Analytics Engineer and make a significant impact across multiple teams. This critical role contributes to Bluecore's technical success and offers high visibility. You will oversee the development of crucial infrastructure, revenue-generating products, and internal KPI strategies. As a technical leader, you will design and develop data assets, set development standards, and advocate for tech investments. You'll also mentor the data team and collaborate across engineering to enhance Bluecore's data maturity. This role is part of the DataOps team within Engineering, working with some of the world's leading retail brands to drive revenue and enhance shopper engagement. The position offers a competitive salary and benefits package.
Requirements
- 3+ years as tech lead for a high-performing data team
- 5+ years working in data pipeline orchestration and data warehouse design, both as an architect and as a developer, with 3+ years using dbt. Extensive experience with complex orchestration, macros, and performance optimization required
- 5+ years architecting & developing a cloud data warehouse supporting client-facing and internal-facing business use cases
- Advanced knowledge of Python for analytics and data workflow/automation use cases
- Expert level knowledge of SQL
- Demonstrated success scaling the “modern data stack” while managing cost and performance
- Demonstrated success implementing coding standards, review processes, and other operational processes
Responsibilities
- Define and execute Bluecore’s long-term data strategy, ensuring alignment with client impact, business goals, and scaling needs
- Lead development of technically complex transform projects to create business-critical data assets
- Architect and develop complex data models and ETL/ELT, ensuring they support Bluecore’s needs for timely analytics and exports
- Lead the evaluation and integration of new technologies, such as those related to scaling data pipeline orchestration and data sharing with clients (e.g., Snowflake), and applying LLMs to Bluecore analytics
- Lead performance and cost saving initiatives and manage slot spend & storage costs across analytics infrastructure
- Maintain thought leadership in the market through blog posts, speaking engagements, and/or social media
- Own commercial relationship with dbt, and support commercial relationship with GCP (BigQuery usage)
- Mentor a team of high-performing analytics engineers through regular code reviews, technical workshops, and personalized feedback
- Develop and implement a structured onboarding program on Analytics Engineering/our data warehouse for new team members, incorporating hands-on projects with dbt to accelerate their integration into the team
- Support the growth of product analysts & BI staff to become stronger technical contributors
- Act as liaison to other parts of Bluecore’s engineering functions, developing working knowledge of adjacent infrastructure, e.g., Terraform
- Enhance collaboration with engineering teams to ensure that data quality, maturity, and extensibility are integral to the software development lifecycle
- Own the development standards, process, and documentation policy for our analytics data orchestration, dbt instance, and data warehouse, including creation and application of style guides
- Ensure data quality: maintain and improve data testing and pipeline observability investments that preserve data quality and enable quick issue discovery and resolution
- Implement a centralized data catalog with technical and business metadata, ensuring all stakeholders can easily discover and understand available data, its lineage, critical business logic and usage constraints
- Oversee the development of a disaster recovery plan, including regular backups and failover mechanisms, to guarantee business continuity in the event of data infrastructure failures
- Lead hardening efforts for dbt, ensuring availability, performance and cost containment over the critical periods (e.g., Thanksgiving holiday weekend)
- Maintain and apply data privacy and security policies across all DataOps infrastructure
Preferred Qualifications
- Working knowledge of adjacent technologies, e.g., Terraform, NoSQL databases, Airflow
Benefits
- $200,000-$240,000 base salary
- Stock option equity
- Perks & benefits
- Development opportunities
- Remote-first organization, with the option to work from our New York headquarters on occasion