Summary
Join MeridianLink's Analytics team as an accomplished Data Engineer. You will expand and improve our data and data pipeline architecture, optimizing data flow and master data management (MDM) for cross-functional teams. As an experienced data pipeline builder and data wrangler, you will support database architects, data analysts, and data scientists, ensuring optimal data delivery across ongoing projects while comfortably supporting multiple teams, systems, and products. This role involves optimizing or re-designing data pipelines to support next-generation products and data initiatives. The ideal candidate is self-directed and excited by the prospect of improving our data systems.
Requirements
- 2-4 years of professional data engineering and data warehousing experience
- Bachelor's degree highly preferred
- Extremely strong implementation experience with Python, Parquet, Spark, Azure Databricks, Delta Lake, Databricks Data Warehouse, Databricks Workflows, Delta Sharing, and Unity Catalog
- SQL development knowledge: stored procedures, triggers, jobs, indexes, partitioning, pruning, etc.
- Ability to write and debug complex SQL queries
- Knowledge of ETL/ELT and data warehousing techniques and best practices
- Experience building, maintaining, and scaling ETL/ELT processes and infrastructure
- Implementation experience with various data modelling techniques
- Implementation experience working with a BI visualization tool (Sisense is a plus)
- Experience with CI/CD tools (GitLab and Jenkins preferred)
- Experience with cloud infrastructure (Azure strongly preferred)
- Experience working in a fast-paced product environment, with a focus on getting the job done while minimizing tech debt
Responsibilities
- Design, develop, and operate large-scale data pipelines to support internal and external consumers
- Improve and automate internal processes
- Integrate data sources to meet business requirements
- Write robust, maintainable, well-documented code
Preferred Qualifications
- Experience with UI development frameworks such as JavaScript, Django, React, etc.
- Ability to work with a variety of ingestion patterns, such as APIs and SQL servers
- Knowledge of Master Data Management
- Prior financial industry experience is a plus
- Ability to navigate ambiguity and pivot easily as business priorities change
- Strong communication, negotiation, and estimation skills
- Team player who collaborates well
Benefits
- Potential for equity-based awards
- Insurance coverage (medical, dental, vision, life, and disability)
- Flexible paid time off
- Paid holidays
- 401(k) plan with company match
- Remote work