Senior Data Engineer I


Dynamo Technologies

πŸ“Remote - Worldwide

Summary

Join Dynamo Technologies as a Senior Data Engineer to build scalable data solutions, support data pipelines, and collaborate with cross-functional teams on efficient data management strategies. The role requires extensive experience in data engineering, data operations, data analytics, and data management; hands-on experience with Python or R for data analytics; strong SQL query development skills; and expertise in cloud environments such as AWS. You will design and implement data storage and processing infrastructure, develop and maintain robust data solutions, support ETL processes, and apply DevOps practices. The ideal candidate also has application development experience in Java, Python, Rust, or Scala and a strong understanding of database instrumentation and automation.

Requirements

  • Proficiency in Python or R for data operations and analytics
  • Strong expertise in SQL development with enterprise databases such as PostgreSQL or SQL Server
  • Experience in Linux/UNIX server environments as well as Windows
  • Hands-on experience with ETL development and data pipeline operations
  • Familiarity with cloud-based solutions such as AWS
  • Experience in application development using Java, Python, Rust, or Scala
  • Knowledge of DevOps practices and automation in data environments
  • Strong understanding of database instrumentation and automation
  • Experience developing line of business applications using Salesforce and SharePoint
  • U.S. citizenship required
  • Minimum of six (6) years of relevant experience in data engineering, analytics, and data management

Responsibilities

  • Design and implement data storage and processing infrastructure to manage large-scale data analytics
  • Develop and maintain robust and scalable solutions for managing structured and unstructured data using traditional databases (PostgreSQL, SQL Server), Massively Parallel Processing (MPP) databases (Redshift), and NoSQL technologies (Hadoop, Spark)
  • Develop complex SQL queries and stored procedures/functions for data manipulation and transformation
  • Support ETL (Extract, Transform, Load) processes, ensuring data quality and integrity
  • Implement DevOps practices and automation techniques to optimize data operations
  • Orchestrate server environments using tools such as Puppet and Ansible
  • Collaborate with data scientists, database architects, and business users to develop data processing use cases
  • Develop and document service APIs for data collection, cleansing, and storage
  • Work with business intelligence tools to develop data visualization solutions
  • Follow the complete Software Development Lifecycle (SDLC), including writing requirements, implementation, testing, documentation, and deployment of data solutions
  • Troubleshoot and optimize database performance and configurations
  • Support Microsoft Access frontend development and integration

Preferred Qualifications

  • Experience in database configuration, troubleshooting, and optimization
  • Experience in developing and deploying APIs for data integration and processing
  • Knowledge of data warehouse development and operations
  • Familiarity with tools such as SQL Power Architect for database and data model development
  • Strong problem-solving skills and the ability to work in a fast-paced environment
  • Bachelor's or advanced degree in Information Systems, Computer Science, Data Science, Mathematics, Statistics, Operations Management, Engineering, or a related field

Benefits

Remote Salary Range: $125 – $135 USD

