Remote Senior Data Engineer
Mosaic.tech (closed)
$160k-$190k
Remote - United States
Job highlights
Summary
Join Mosaic, a leading company providing financial planning and business performance solutions, as a Senior Data Engineer. Collaborate with stakeholders, design scalable data pipelines, and optimize data infrastructure.
Requirements
- Strong communication and collaboration skills, with the ability to work effectively in a distributed team across various time zones
- Demonstrated ability to manage data projects from start to finish, effectively negotiating requirements and deliverables with key stakeholders
- 5+ years of experience in data engineering or a related field working with data in a high-volume environment
- Proficiency in programming languages such as Python, Java, or Scala
- Extensive experience with SQL and database technologies (e.g., PostgreSQL, MySQL, Oracle)
- Familiarity with data orchestration tools (e.g., Apache Airflow), data transformation tools (e.g., Spark), dimensional modeling (e.g., star schema), metadata, indexing, dependencies, and data workflows to support data analytics and data science
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (e.g., AWS, Azure, Google Cloud)
- Familiarity with data warehousing solutions (e.g., Redshift, BigQuery, Snowflake)
- Solid understanding of data modeling, data architecture, and database design principles
- Excellent problem-solving skills and attention to detail
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
Responsibilities
- Design, develop, and maintain scalable and efficient data pipelines to process large volumes of data from various sources
- Collaborate with stakeholders and other backend engineers to understand data requirements and deliver high-quality data solutions
- Optimize and maintain data infrastructure, ensuring reliability, scalability, and performance
- Implement best practices for data management, including data governance and data quality
- Develop and maintain ETL processes to integrate data from multiple heterogeneous sources into a unified data warehouse
- Monitor and troubleshoot data pipeline issues, ensuring data integrity and availability
Preferred Qualifications
- Subject Matter Expertise (SME) on data structure and datasets in the Financial Planning and Analysis space
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes)
- Experience in a fast-paced, agile development environment
- Understanding of data lake and data lakehouse architectures and Delta Lake or Apache Iceberg table formats
Benefits
- $160,000 - $190,000 a year
- Stock options, benefits, and additional opportunities for incentives and bonuses for performance beyond goals
This job is filled or no longer available