Data Architect

Brillio
Remote - United States
Summary
Join Brillio, a rapidly growing digital technology services provider, as a Data Architect. This remote role requires 8+ years of experience and a strong background in data warehousing, ETL processes, and DBT. You will design, develop, and maintain DBT models and SQL code to build efficient data pipelines. Proficiency in Python, including Pandas and NumPy, is essential, along with advanced SQL skills and Unix shell scripting. Experience with Snowflake and Salesforce CDP is a plus. Brillio offers a collaborative environment and opportunities to work on cutting-edge projects.
Requirements
- Role: Data Architect
- Years of experience: 8+
- 9+ years of overall IT experience
- Proven experience with DBT (Data Build Tool), including model development, transformations, and testing
- Hands-on DBT development experience
- Strong Python experience is mandatory
- Hands-on, in-depth experience with Python's Pandas and NumPy libraries (see the sketch after this list)
- Advanced SQL skills are mandatory
- Ability to write complex SQL queries to query large amounts of data
- Mandatory hands-on coding experience in Unix shell scripting
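To illustrate the depth of Pandas and NumPy experience this role calls for, here is a minimal sketch of a typical cleanup-and-transform step; the column names and rules are hypothetical, not drawn from Brillio's actual pipelines:

```python
import numpy as np
import pandas as pd

# Hypothetical raw extract; column names are illustrative only.
raw = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": ["10.50", "n/a", "22.00", "5.25"],
    "region": ["east", "WEST", "East", None],
})

# Coerce amounts to numeric; unparseable strings become NaN, then 0.
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce").fillna(0.0)

# Normalize categorical values and fill missing regions.
raw["region"] = raw["region"].str.lower().fillna("unknown")

# Vectorized derived column with NumPy: flag large orders.
raw["is_large"] = np.where(raw["amount"] > 20.0, True, False)

print(raw)
```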
Responsibilities
- Design, develop, and maintain DBT models, transformations, and SQL code to build efficient data pipelines for analytics and reporting
- Design, build, and maintain robust ETL processes and data pipelines using DBT and other relevant tools
- Optimize ETL pipelines for performance, scalability, and cost-efficiency
- Write effective, scalable code in Python
- Troubleshoot and resolve any issues related to the ETL process or data pipeline performance
- Integrate user-facing elements into applications
- Write SQL queries against Snowflake (a connection sketch follows this list)
- Develop Unix shell and Python scripts to extract, load, and transform data
- Coordinate with internal teams to understand user requirements and provide technical solutions
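As a hedged illustration of the Snowflake query work listed above, the sketch below uses the snowflake-connector-python package; the account, credentials, and table name are placeholders, and a real pipeline would likely source credentials from a secrets manager:

```python
import snowflake.connector

# All connection parameters below are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Illustrative aggregate; the orders table is an assumption.
    cur.execute(
        "SELECT region, SUM(amount) AS total "
        "FROM orders GROUP BY region ORDER BY total DESC"
    )
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```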
Preferred Qualifications
- Knowledge of Salesforce CDP and experience with Snowflake implementations are a plus
- Working experience with Airflow or other data pipeline orchestration tools is a plus (a minimal DAG sketch follows this list)
- Knowledge of cloud platforms (AWS, GCP, Azure) and experience with cloud-based data storage solutions (e.g., S3, GCS, Blob Storage)
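For orientation, a minimal Airflow DAG that schedules a nightly DBT build might look like the sketch below; the DAG id, schedule, and project path are assumptions rather than details from this posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG orchestrating a nightly DBT build; the dag_id,
# schedule, and project path are placeholders.
with DAG(
    dag_id="dbt_nightly_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run",  # placeholder path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test",  # placeholder path
    )

    # Build models first, then run DBT tests against them.
    dbt_run >> dbt_test
```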