Senior Data Engineer

Blue Orange Digital

πŸ’΅ $150k-$170k
πŸ“Remote - United States

Summary

Join Blue Orange Digital as a Senior Data Engineer and work with a major enterprise client in the global supply chain and logistics industry. This independent contractor role calls for a multidisciplinary, hands-on approach to software and data engineering. You will build, maintain, and enhance data ingestion pipelines, data models, orchestrations, and validation tests. Mastering the platform's data flows and code is essential to improving its functionality, and you will develop deep operational knowledge across its microservices and data flows in AWS. You will collaborate with the team to evolve the data architecture and embrace Agile methodologies for continuous value delivery. The ideal candidate has extensive experience in data platform development and AWS, along with advanced Python, SQL, and Bash scripting skills.

Requirements

  • BA/BS degree in Computer Science or a related technical field, or equivalent practical experience
  • At least 7 years' experience building and supporting data platforms; exposure to data technologies, e.g., RDS, DynamoDB, Redshift, EMR, Glue, Kafka, Kinesis, MSK, Data Pipeline, Lake Formation, dbt, Airflow, Spark, etc.
  • Experience with AWS and exposure to other cloud data platforms, such as ADF, Azure Fabric, Snowflake, or Databricks
  • Advanced-level Python, SQL, and Bash scripting skills
  • Experience designing and building robust CI/CD pipelines
  • Comfortable with Docker, configuration management, and monitoring tools
  • Knowledge of best practices related to security, performance, and disaster recovery
  • Excellent verbal and written English communication
  • Interact with others using sound judgment, good humor, and consistent fairness in a fast-paced environment
  • Ability to maintain poise, efficiency, and effectiveness in fast-paced, high-stakes environments

Responsibilities

  • Work quickly with client experts and stakeholders to learn the existing data flows, code bases, infrastructure, operations, and logs
  • Build, maintain, and enhance data ingestion pipelines, data models, orchestrations, transformations, and validation tests
  • Quickly master the data flows, data sets, and code so you can begin enhancing and advancing the platform as a whole, both functionally and non-functionally
  • Develop deep operational knowledge of all microservices, code bases, and data flows supporting the platform in AWS
  • Evolve the data architecture in collaboration with the existing team to take on adjacent platform missions and growing data volumes
  • Stay professionally curious about the platform, its code, and its data, and share that curiosity with colleagues
  • Work in a delivery-focused Agile mode to continuously deliver value for clients

Preferred Qualifications

  • 12+ years of experience in a data engineering role, with experience in ETL, data warehousing, data lakes, lakehouses, pipelines, modeling, and data quality validation
  • Expert experience with data ingestion, modeling and conformance/compliance validation
  • Expert-level skills with SQL, statistical analysis and data validation
  • Experience with GCP, Azure, Snowflake, Oracle, etc
  • BA or BS degree in a technical or quantitative field (e.g.: computer science, statistics)
  • Excellent verbal and written English communication
  • Experience in the logistics services industry is a plus
  • Experience with R, Python, Scala, SPSS, Teradata, SAS, Power BI, Tableau, or Looker is a plus
  • Certifications in AWS, Azure DevOps, Azure Data Fundamentals, Databricks, or Snowflake are a plus
