Architect

Datacom
Remote - New Zealand
Summary
Join Datacom's growing analytics team as a Databricks Architect, working on diverse customer projects focused on data. Collaborate with developers, data scientists, and engineers to design and maintain data architectures enabling data-driven insights. Translate business requirements into scalable data solutions, showcasing expertise in Databricks architecture and the platform. This role offers flexibility with a location of your choice. Datacom provides a dynamic and supportive work environment with various perks. The successful candidate will have a passion for Databricks and a proven track record in data solutions development.
Requirements
- Extensive knowledge of Databricks architecture and features
- Experience in the design, development and implementation of data solutions using Databricks
- Experience with data ingestion tools such as Apache Kafka, Informatica, AWS Glue, Azure Event Hubs and Fivetran
- Strong programming skills, especially in Python, Java, Scala, C++ or C#
- Experience with cloud platforms such as Azure, AWS and GCP
- Experience with relational database management systems such as SQL Server and Redshift
- Experience with distributed processing technologies such as Apache Spark
- Working knowledge of message queuing, stream processing, and highly scalable data stores
- Experience building CI/CD pipelines with tools such as GitHub Actions or Azure DevOps
- Degree in Computer Science, Data Science, Statistics or related field
- Ability to connect customers' specific business problems to Databricks solutions
- Minimum 3 years' experience in Databricks development
- Experience developing and deploying data pipelines into live environments
- A passion for lean, clean and maintainable code
- Strong analytical skills for working with both structured and unstructured datasets
- Curious and enthusiastic mindset
- The desire to grow and to share insights with others
Responsibilities
- Designing data platforms, distributed systems, data lakes and data stores
- Creating software solutions for data ingest & integration
- Developing and operationalising reliable data pipelines & ETL patterns
- Building analytics tools to provide actionable insights and solve business problems
- Developing infrastructure
- Wrangling and integrating data from multiple sources
- Identifying ways to improve data reliability, efficiency and quality
Preferred Qualifications
- Knowledge of machine learning frameworks and theory
- Experience with analytics platforms and tools such as Databricks, Alteryx, SAS, KNIME or DataRobot
- Familiarity with Data Vault or Kimball modelling methodologies
- DevOps / DataOps
Benefits
- Remote working
- Flexi-hours
- Professional development courses