Senior Data Specialist

Brillio

πŸ’΅ $120k-$125k
πŸ“Remote - United States

Job highlights

Summary

Join Brillio as a Senior Data Specialist and work remotely from anywhere in the United States. This role requires 6+ years of IT experience with strong Python and advanced SQL skills. You will design, develop, and maintain DBT models and data pipelines. Hands-on experience with Unix shell scripting and data analytics implementations (Data Lake and Data Warehouse) is mandatory. The position offers a salary of $120,000 to $125,000 per year. Brillio is an equal opportunity employer committed to fostering a diverse and inclusive workplace.

Requirements

  • Have 6+ years of overall IT experience
  • Possess strong Python experience
  • Write effective, scalable code in Python
  • Have in-depth, hands-on experience with the Pandas and NumPy Python libraries
  • Possess advanced SQL skills
  • Be able to write complex SQL queries against large volumes of data
  • Have hands-on DBT development experience
  • Design, develop, and maintain DBT models, transformations, and SQL code to build efficient data pipelines for analytics and reporting
  • Have hands-on coding experience in Unix shell scripting (mandatory)
  • Develop back-end components to improve responsiveness and overall performance
  • Have hands-on experience designing and building data pipelines in data analytics implementations such as Data Lakes and Data Warehouses
  • Integrate user-facing elements into applications
  • Write SQL queries against Snowflake
  • Develop Unix shell and Python scripts to extract, load, and transform (ELT) data
  • Test and debug programs
  • Improve functionality of existing systems
  • Implement security and data protection solutions
  • Coordinate with internal teams to understand user requirements and provide technical solutions
  • Support the QA, UAT, and performance testing phases of the development cycle
  • Understand and incorporate the required security framework in the developed data model and ETL objects
  • Define standards and procedures, and refine methods and techniques, for data extraction, transformation, and loading (ETL) in both batch and near-real-time modes
  • Design data integration, data warehousing, analytics, reporting, and data science strategies

Preferred Qualifications

  • Have Salesforce CDP knowledge and Snowflake implementation experience
  • Have working experience with Airflow
  • Have exposure to the AWS ecosystem

Benefits

  • $120,000 - $125,000 a year
  • Remote work
