Related skills
Snowflake, ETL, SQL, data modeling, Git
📋 Description
- Design, build, test, and maintain scalable data pipelines for core Insurance systems.
- Reliably ingest data from source systems into the data platform (e.g., Snowflake).
- Develop and maintain canonical data models and curated datasets for analytics.
- Implement data quality checks, reconciliations, and automated testing (see the sketch after this list).
- Partner with cross-functional stakeholders to translate requirements into data transformations.
- Establish documentation, metadata, and data governance practices.
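A minimal sketch of the kind of reconciliation check this work involves, assuming hypothetical staging and curated tables (raw.stg_claims, curated.claims) and a claim_amount column; none of these names come from the posting itself:

```sql
-- Illustrative reconciliation between a staging table and its curated counterpart.
-- Table and column names (raw.stg_claims, curated.claims, claim_amount) are
-- hypothetical placeholders, not actual schema objects for this role.
WITH staging AS (
    SELECT COUNT(*)          AS row_count,
           SUM(claim_amount) AS total_amount
    FROM raw.stg_claims
),
curated AS (
    SELECT COUNT(*)          AS row_count,
           SUM(claim_amount) AS total_amount
    FROM curated.claims
)
SELECT s.row_count    AS staging_rows,
       c.row_count    AS curated_rows,
       s.total_amount AS staging_amount,
       c.total_amount AS curated_amount,
       -- Flag mismatches so an orchestrator or test framework can fail the load.
       CASE WHEN s.row_count = c.row_count
             AND s.total_amount = c.total_amount
            THEN 'PASS' ELSE 'FAIL' END AS reconciliation_status
FROM staging s
CROSS JOIN curated c;
```

In practice a check like this is wired into the pipeline's automated tests so a FAIL result blocks downstream loads rather than silently publishing bad data.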
🎯 Requirements
- Bachelor’s degree in CS/Engineering or equivalent experience
- 4+ years building and supporting production data pipelines
- Strong SQL skills, including complex queries and query optimization
- Experience with cloud data warehouses (Snowflake) and ELT/ETL
- Knowledge of data modeling (dimensional, slowly changing dimensions, Data Vault); an SCD Type 2 sketch follows this list
- Experience with Git and collaborative engineering workflows
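A minimal Type 2 slowly changing dimension (SCD) sketch in Snowflake-style SQL, assuming hypothetical stg_customer and dim_customer tables that track a single attribute (address); the names and columns are illustrative assumptions only:

```sql
-- Minimal SCD Type 2 sketch (Snowflake-style SQL). Table and column names
-- (stg_customer, dim_customer, customer_id, address, ...) are hypothetical.

-- Step 1: close out current dimension rows whose tracked attribute has changed.
MERGE INTO dim_customer AS d
USING stg_customer AS s
    ON d.customer_id = s.customer_id
   AND d.is_current = TRUE
WHEN MATCHED AND d.address <> s.address THEN UPDATE SET
    valid_to   = CURRENT_TIMESTAMP(),
    is_current = FALSE;

-- Step 2: insert a new current row for changed or brand-new customers.
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id,
       s.address,
       CURRENT_TIMESTAMP(),
       NULL,
       TRUE
FROM stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.is_current = TRUE
WHERE d.customer_id IS NULL      -- brand-new customer
   OR d.address <> s.address;    -- attribute changed, old row closed in step 1
```

A production pipeline would typically compare a hash of the full attribute set rather than a single column, but the two-step close-out/insert pattern is the same.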