uShip is hiring a
Data Engineer

πŸ’΅ $110k-$125k
πŸ“Remote - United States

Summary

uShip is hiring a Data Engineer to collaborate with stakeholders, maintain the Customer Data Platform, design data strategies, improve processes, enhance data quality, resolve technical issues, build analytics tools, and source data for modeling. The position requires experience with specific database technologies, Tealium or a similar CDP, Python, T-SQL, data warehouse modeling, ELT/ETL tools, and data mining techniques.

Requirements

  • 3+ years of experience in the following database technologies: SQL Server, DynamoDB, Snowflake, MongoDB, and PostgreSQL
  • 2+ years of hands-on experience with Tealium or similar Customer Data Platform
  • 3+ years of experience with Python and T-SQL
  • 2+ years with data warehouse modeling approaches such as Star or Snowflake schema design, and denormalizing highly transactional datasets
  • 2+ years of experience with ELT/ETL tools such as FiveTran, Stitch, AWS Glue, Pentaho, dbt, etc., as well as data mining and segmentation techniques

Responsibilities

  • Collaborate with internal and external stakeholders to determine and fully develop acceptance criteria that build and drive our data strategy
  • Maintain the Customer Data Platform using Tealium or a similar product
  • Work with the data engineering team to design, maintain, and execute on our vision for uShip’s Data Lake, Data Warehouse, and Data Exchange
  • Partner with team to improve existing processes and develop new ways to increase efficiency, building out automation whenever possible
  • Participate in developing effective ways to both enhance data quality and reliability
  • Resolve data-related technical issues and support data infrastructure needs for internal or external stakeholders
  • Assist in building out the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources into our Data Warehouse, using technologies such as AWS, orchestration and replication tooling, dbt, Snowflake, etc.
  • Maintain existing legacy transformation processes in Pentaho, SSMS and SSIS
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics to support our reporting and analytics team
  • Source, collect and prepare data for prescriptive or predictive modeling

Preferred Qualifications

  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Strong organizational skills
  • Understanding of the basics of distributed systems
  • Knowledge of algorithms and data structures
  • Awareness of data governance and data security principles

Benefits

  • Remote or hybrid work options
  • Monthly Wellness Reimbursements
  • Home office Reimbursements
  • Company paid meal delivery pass, plus monthly credit
  • 100% paid health and dental coverage available
  • 401(k) matching, no vesting
  • Stock Options
  • Pet Insurance
  • Dog-friendly downtown office

This job is filled or no longer available
