Remote Finance Analyst


HashiCorp

πŸ“Remote - Canada

Job highlights

Summary

Join our Data Analytics & Engineering organization as a mid-level engineer to enable HashiCorp to leverage data as a strategic asset by providing reliable, scalable, and efficient data solutions.

Requirements

  • Minimum 3 years of experience as a Finance Data Analyst
  • At least 2 years of experience working with financial data sets
  • Experience preparing finance reports such as balance sheets, income statements, and pipeline and billing metrics
  • At least 2 years of experience using NetSuite, Coupa, Stripe, and other financial platforms
  • Able to support the Finance and Accounting teams by converting their business use cases into SQL and building financial reports from them
  • Adhere to data privacy and security protocols to protect sensitive financial data
  • Experience developing and deploying data pipelines, preferably in the cloud
  • Minimum 2 years of experience with Snowflake: Snowflake SQL, Snowpipe, streams, stored procedures, tasks, hashing, row-level security, Time Travel, etc.
  • Strong written and oral communication skills with the ability to synthesize, simplify and explain complex problems to different audiences

Responsibilities

  • Secure financial data in the data warehouse and retrieve data from financial sources
  • Build analytics and automate transformations to support and partner with the Finance team
  • Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices
  • Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
  • Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available to the key stakeholders and business processes that depend on it
  • Perform the data analysis required to troubleshoot and resolve data issues
  • Develop best practices for data structure to ensure consistency within the system

Preferred Qualifications

  • Hands-on experience with Snowpark and with app development using Snowpark and Streamlit
  • Strong experience with ETL or ELT data pipelines and related concepts in pure SQL, such as SCD dimensions and delta processing
  • Experience working with AWS cloud services: S3, Lambda, Glue, Athena, IAM, and CloudWatch
  • Experience creating pipelines for real-time and near-real-time integration, working with different data sources: flat files, XML, JSON, Avro files, and databases
  • 2 years of experience in Python, writing maintainable, reusable, and complex functions for backend data processing

This job is filled or no longer available