Remote Data Manager

Bounteous

πŸ“Remote - India

Job highlights

Summary

Join our team as an Architect and help Fortune 500 clients succeed by leveraging their data to make smarter decisions. You will work closely with clients and key business stakeholders to understand the use cases, objectives, and KPIs they want to track and optimize using data and analytics.

Requirements

  • Working experience with at least one RDBMS data store (e.g. Oracle, MySQL, Redshift) and at least one NoSQL data store (e.g. HBase, MongoDB, Cassandra)
  • Strong ETL skills and expertise in building data services on Amazon Web Services
  • Able to build end-to-end applications using NiFi, Kafka, etc.
  • Implement and support efficient, reliable data pipelines that move data from a wide variety of sources to data marts / data lakes
  • Implement data aggregation, cleansing and transformation layers
  • Ability to build data ingestion frameworks that take into account security, access patterns, scalability, response time, and availability
  • Experience in Big data integration and stream processing technologies using Apache Kafka, Kafka Connect (Confluent), Apache NiFi, Spark, Hive
  • Experience writing Pub/Sub APIs and developing with Kafka Streams, Kafka Connect, and KSQL
  • Develop new processors within Apache NiFi, and establish new data flows and troubleshoot existing ones across the hardware instances associated with the different data platforms
  • Experience with serialization formats such as JSON and/or BSON
  • Understanding / Experience of working with Hadoop Clusters
  • A strong Java/Spring application background and experience developing in a microservices architecture are good to have
  • Experience with reporting and analytics tools is also good to have

Responsibilities

  • Play an architect role in helping our Fortune 500 clients succeed by leveraging their data to make smarter decisions
  • Work closely with clients and key business stakeholders to understand use cases, objectives and KPIs they want to track/optimize using data and analytics
  • Provide end-to-end solutions spanning data integration/pipelines using cloud-native tools, database/data warehouse architecture, and data engineering: data analysis, analytics, and insights
  • Implement core parts of solutions with sound designs, using technologies such as SQL, Python, RDS and Redshift on AWS, and data visualization tools
  • Dive deep into technical requirements with the engineering team while also determining how best to package and present the data to business stakeholders for the best outcome
  • Analyze customer requests, raise relevant questions, and clarify requirements before starting development
  • Proactively foresee delivery risks and address them with the customer to avoid escalations
  • Lead and mentor a team of data engineers and bring them up to speed
  • Maintain awareness of information security measures such as acceptable use of information assets, malware protection, and password security
  • Understand and report security risks and how they impact the confidentiality, integrity, and availability of information assets
  • Understand how data is stored, processed, and transmitted from a data privacy and protection standpoint

Disclaimer: Please verify that the job is genuine before you apply. Applying may take you to another website that we do not own. Any actions taken during the application process are solely your responsibility, and we bear no responsibility for any outcomes.
Please let Bounteous know you found this job on JobsCollider. Thanks! πŸ™