Data Engineer
BPM
💵 $105k-$140k
📍 Remote - United States
Summary
Join the Business Insights and Analytics team in the Business Transformation Office as a Data Engineer for a 12-month contract with the potential for a permanent role. You will build, maintain, and govern data pipelines using Azure and Databricks. Responsibilities include developing data pipelines to the data lakehouse, ensuring data quality and security, and collaborating with stakeholders. The ideal candidate possesses strong data engineering skills, experience with big data technologies, and excellent communication abilities. This fully remote position offers a competitive salary and flexible scheduling.
Requirements
- Possess an undergraduate degree in data or computer science, information technology, statistics, or mathematics
- Have a minimum of 2 years of work experience as a Data Engineer in a Databricks environment, with specific expertise in Delta Lake, notebooks, and clusters
- Have experience with Data Vault Modeling
- Possess knowledge of big data technologies and tools, such as Hadoop, Spark, and Kafka
- Have a strong understanding of relational data structures, theories, principles, and practices
- Possess proficiency in programming languages such as Python and SQL
- Have a strong understanding of data modeling, algorithms, and data transformation strategies and techniques for data science consumption
- Possess proven experience with business and technical requirements analysis, elicitation, modeling, verification, and methodology development
- Have the ability to create systematic and consistent requirements specifications in both technical and user-friendly language
- Possess excellent critical thinking skills and the ability to understand the relationships between data and business intelligence
Responsibilities
- Develop, deploy, and support high-quality, fault-tolerant data pipelines
- Build the infrastructure required for optimal extraction, loading, and transformation of data from a wide variety of data sources using best practices
- Support the architecture for observing, cataloging, and governing data
- Develop ETL functionality through coding (Python, dbt, SQL)
- Build and optimize ETL pipelines
- Monitor and troubleshoot ELT processes to ensure data accuracy and reliability
- Monitor and analyze performance metrics and recommend improvements
- Ensure security and integrity of data
- Implement data governance and access controls to ensure data security and compliance
- Collaborate with the security team to implement encryption, authentication, and authorization mechanisms
- Monitor and audit data access to maintain data privacy and integrity
- Collaborate with the team as well as key stakeholders across the organization
- Profile data sources to ensure data is delivered to the lakehouse effectively
- Utilize development best practices including technical design reviews, implementing test plans, monitoring/alerting, peer code reviews, and documentation
- Communicate with both technical and business end-user colleagues to deliver meaningful outcomes
- Partner with the Information Technology Department
Benefits
- Fully remote position
- Ability to work flexible schedule (am, pm & weekends as needed)
- Salary: $105,000 - $140,000 a year