Senior Software Engineer - Java & Kafka

3Pillar Global
Summary
Join 3Pillar, a leading company in the tech industry, and contribute to transformative projects that redefine sectors such as urban living, media, and healthcare. As a Senior Software Engineer, you will design and implement scalable data pipelines using Java and big data technologies, ensuring data integrity and performance. You will manage pipeline versioning and change management, and develop and maintain ETL/ELT processes for efficient data ingestion, transformation, and storage. The role also involves working with relational and NoSQL databases, automating data workflows, and ensuring data accuracy and compliance, as well as mentoring junior engineers and collaborating with the wider team to drive client success.
Requirements
- Demonstrated expertise with at least 5 years of experience as a software engineer
- Proficiency in Java 8+ and frameworks like Spring Boot, with exposure to Kafka and Spark
- Proficient in data pipeline and workflow management tools like Airflow
- Advanced SQL skills and experience with relational and NoSQL databases such as MySQL and MongoDB
- Exposure to working on Data Lake and Data Warehouse solutions
- Excellent problem-solving, communication, and organizational skills
- Proven ability to work both independently and as part of a team
Responsibilities
- Understanding the business requirements and implementing the technical solution
- Designing, developing, and maintaining scalable data pipelines using Java and big data technologies
- Managing data pipeline versioning and change management, including the complexity inherent in versioned pipelines
- Working with relational (SQL) and NoSQL databases, ensuring data integrity and performance
- Developing, maintaining, and troubleshooting ETL/ELT processes for efficient data ingestion, transformation, and storage
- Automating and optimizing data workflows such as data ingestion, aggregation, and ETL processing
- Designing, building, and maintaining batch or real-time data pipelines in production
- Ensuring data accuracy, integrity, privacy, security, and compliance through quality control procedures
- Writing secure and scalable APIs that expose data to consumers
- Training and mentoring junior engineers
- Executing complex work within the established methodology and quality standards, demonstrating success across diverse engagements
- Promoting client success across the team by collaborating with engineers, designers, and managers to understand user pain points, anticipate potential problems, and iterate on solutions that drive client success
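The ETL/ELT responsibilities above can be sketched in miniature as a plain-Java pipeline: extract raw rows, transform them into typed records, and load them into a keyed store. All names here (`EtlSketch`, `SensorReading`, the Fahrenheit-to-Celsius rule) are illustrative assumptions, not part of any 3Pillar codebase.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Minimal ETL sketch: extract "id,fahrenheit" rows, transform them into
// typed readings in Celsius, and load them into an in-memory store keyed
// by id. Malformed rows are dropped to keep the pipeline resilient.
public class EtlSketch {

    // A hypothetical record representing one transformed row.
    record SensorReading(String id, double celsius) {}

    static Map<String, SensorReading> run(List<String> rawRows) {
        return rawRows.stream()
                .map(row -> row.split(","))                     // extract fields
                .filter(parts -> parts.length == 2)             // drop malformed rows
                .map(parts -> new SensorReading(                // transform units
                        parts[0].trim(),
                        (Double.parseDouble(parts[1].trim()) - 32) * 5.0 / 9.0))
                .collect(Collectors.toMap(SensorReading::id, r -> r)); // load
    }

    public static void main(String[] args) {
        run(List.of("s1, 212", "s2, 32", "broken-row"))
                .values()
                .forEach(System.out::println);
    }
}
```

In a production pipeline the extract and load steps would read from and write to Kafka topics or databases rather than in-memory lists, but the shape of the transformation stays the same.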
Preferred Qualifications
- Prior experience working with IoT devices is an advantage
- Experience with big data technologies such as MapReduce, Hadoop, and Hive
- Experience with data visualization tools such as Power BI, Tableau, and AWS QuickSight
- Experience with any public cloud (AWS/Azure/GCP)
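For candidates unfamiliar with the MapReduce model mentioned above, it can be sketched with Java streams: a map phase that splits lines into words and a reduce phase that aggregates counts per word. This is purely illustrative of the programming model; a real Hadoop job would use `Mapper` and `Reducer` classes instead.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// MapReduce-style word count sketched with Java streams.
public class WordCountSketch {

    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                // Map phase: emit one lowercase word per token.
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // Reduce phase: group identical words and count them.
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        System.out.println(count(List.of("big data big pipelines", "data flows")));
    }
}
```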
Benefits
- Imagine a flexible work environment, whether it's the office, your home, or a blend of both
- From interviews to onboarding, we embody a remote-first approach
- You will be part of a global team, learning from top talent around the world and across cultures, speaking English every day
- Our global workforce enables our team to leverage global resources to accomplish our work in efficient and effective teams
- We're big on your well-being; as a company, we spend a whole trimester of our annual cycle focused on well-being
- Whether it is taking advantage of fitness offerings, mental health plans (country-dependent), or simply leveraging generous time off, we want all of our team members operating at their best
- Our professional services model enables us to accelerate career growth and development opportunities - across projects, offerings, and industries