ETL Developer

Vericast
$75k-$95k
Remote - United States
Summary
Join Vericast's Data Engineering team as a motivated ETL Developer IV. You will build big data pipelines and marketing campaign automations that support analytics and reporting needs, expand our data infrastructure, and optimize data flow into our Data Lake environment. You will collaborate with the analytics, reporting, and data science teams on data initiatives and ensure that the optimal data delivery architecture remains consistent and available. This role requires strong self-direction and the ability to support multiple teams, systems, and products. Contribute your skills to building high-quality, data-driven marketing products in an agile team environment.
Requirements
- Hold a Bachelor of Technology in Computer Science/Information Technology or a related field with 4+ years of experience OR a Master of Science in Computer Science/Information Technology or a related field
- Possess 4+ years of experience in a Data Engineering or ETL Development role
- Demonstrate strong experience with SQL development, tuning, and debugging, and working familiarity with various RDBMS
- Have experience with PySpark and Python for building data pipelines
- Possess hands-on experience with Data Lakehouse and Data Warehouse environments
- Have solid programming skills in object-oriented/functional scripting languages like Python and PySpark for building data pipelines, with experience in testing and logging to ensure code and data observability
- Be proficient in querying relational databases, query optimization, and performance tuning
- Have experience building data processing pipelines with ETL tools such as Talend, SSIS, etc.
- Have experience with Agile Software Development methodologies
- Have experience with GitLab and CI/CD processes
Responsibilities
- Develop scalable data pipelines and build new integrations to support increasing data volume and complexity
- Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increase data accessibility, and foster data-driven decision-making
- Implement processes and systems to monitor data quality, ensuring availability and accuracy of production data for key stakeholders and business processes
- Perform data analysis to troubleshoot data-related issues and assist in resolving them
- Provide post-deployment support and quickly respond to and resolve unexpected service problems in production
- Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture
Preferred Qualifications
- Have hands-on experience with Iceberg, Hive, S3, and Trino
- Have hands-on experience with Talend, RedPoint, or other ETL technologies
- Be proficient in data visualization tools like Tableau and matplotlib
- Have AWS cloud experience with Redshift, Lambda, SageMaker, and Glue
- Possess excellent data analytical, conceptual, and problem-solving skills
- Possess excellent communication skills to promote cross-team collaboration
Benefits
- Medical, dental, and vision coverage
- 401(k) with company match
- Generous PTO allowance
- Life insurance
- Employee assistance
- Pet insurance