Data Engineer

Gemini
💵 $104k-$145k
📍 Remote - United States
Please let Gemini know you found this job on JobsCollider. Thanks!
Summary
Join Gemini's Data team as a Data Engineer and contribute to shaping our data approach. You will design, build, and maintain data pipelines and models, collaborating with analysts and other teams. Leverage your expertise in Python and SQL to improve data processes and drive informed business decisions. Mentor junior engineers and communicate insights to organizational leaders. This role requires extensive experience in data engineering, data warehousing, and ETL processes. Gemini offers a competitive salary, bonus, equity, comprehensive health plans, 401k matching, paid parental leave, and flexible time off.
Requirements
- 4+ years of experience in data engineering with data warehouse technologies
- 4+ years of experience in custom ETL design, implementation, and maintenance
- 4+ years of experience with schema design and dimensional data modeling
- Advanced skills with Python and SQL are a must
- Experience with one or more MPP databases (Redshift, BigQuery, Snowflake, etc.)
- Experience with one or more ETL tools (Informatica, Pentaho, SSIS, Alooma, etc.)
- Strong computer science fundamentals including data structures and algorithms
- Strong software engineering skills in any server-side language, preferably Python
- Experience working collaboratively across different teams and departments
- Strong technical and business communication skills
Responsibilities
- Design, automate, build, and launch scalable, efficient and reliable data pipelines into production using Python
- Design, build and enhance dimensional models for Data Warehouse and BI solutions
- Research new tools and technologies to improve existing processes
- Work closely with data analysts to understand data integration and modeling requirements
- Develop new systems and tools to enable the teams to consume and understand data more intuitively
- Perform root cause analysis and resolve production and data issues
- Create test plans and test scripts, and perform data validation
- Tune SQL queries, reports and ETL pipelines
- Build and maintain data dictionary and process documentation
- Take ownership and work autonomously
Preferred Qualifications
- Experience with Kafka, HDFS, Hive, and cloud computing is a plus
- Experience with continuous integration and deployment
- Knowledge of and experience with financial markets, banking, or exchanges
Benefits
- Competitive starting salary
- A discretionary annual bonus
- Long-term incentive in the form of a new hire equity grant
- Comprehensive health plans
- 401(k) with company matching
- Paid parental leave
- Flexible time off