Remote Data Engineer

Dynatron Software, Inc.

πŸ’΅ $140k-$150k
πŸ“Remote - Worldwide

Summary

Join our team at Dynatron Software as a Data Engineering Expert and contribute to the success of our SaaS and DaaS product portfolio. As a critical member of our team, you will be responsible for ensuring that our data pipelines, data warehouse, and overall data quality are rock-solid. You must have expertise in Snowflake, Python, SQL, ETL/ELT platforms, Airflow/MWAA, DBT, Infrastructure-as-Code platforms, and MySQL databases.

Requirements

  • Expert-level proficiency with Snowflake
  • Expert in creating data pipelines using Python and SQL
  • Experience with ETL/ELT platforms such as Airbyte and No-Code/Low-Code platforms
  • Experience with Airflow/MWAA, including developing DAGs and integrating with CI/CD pipelines
  • Experience with DBT including developing models, validation, and deployment
  • Experience with Infrastructure-as-Code platforms such as Terraform
  • Ability to understand and support pipelines written in PHP
  • Experience working with MySQL databases (Vitess and sharded databases would be a plus)
  • Extensive experience ensuring data quality throughout data pipelines and the data warehouse, and addressing data quality issues in the existing architecture
  • Ability to collaborate closely with cross-functional teams, including product owners, software engineers, data scientists, and business stakeholders, to understand data requirements and deliver solutions that meet evolving business needs
  • Commitment to excellence, integrity, and professionalism in all aspects of your work, leading by example
  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 5+ years of experience in designing, implementing, and maintaining relational/data warehousing environments
  • Experience with Big Data technologies such as Hadoop, Spark, Beam, Flink, or similar technologies is a plus
  • Excellent problem-solving skills with a strong attention to detail
  • Ability to work both independently and as part of a team
  • Excellent verbal and written communication skills

Responsibilities

  • Establishing data architecture
  • Developing data pipelines
  • Administration of the platform
  • Creating data pipelines using Python and SQL
  • Building and maintaining ETL/ELT workflows on platforms such as Airbyte and No-Code/Low-Code platforms
  • Developing Airflow/MWAA DAGs and integrating them with CI/CD pipelines
  • Developing, validating, and deploying DBT models
  • Managing infrastructure with Infrastructure-as-Code platforms such as Terraform
  • Supporting existing pipelines written in PHP
  • Working with MySQL databases (including Vitess and sharded databases)

Benefits

  • Health insurance
  • Dental insurance
  • Vision insurance
  • Short term disability insurance
  • Stock options
  • Work from home and flexible scheduling depending on job requirements
  • Professional development opportunities
  • 9 paid holidays
  • 15 days PTO
  • Home office setup support for remote employees
