Related skills
snowflake · sql · python · dbt · airflow

📋 Description
- Build Bronze and Silver pipelines with full data lineage.
- Own ingestion, transformation, and enrichment of healthcare data.
- Translate requirements into production pipelines with enterprise patterns.
- Implement automated ingestion using Snowpipe, Matillion, or custom tools.
- Develop dbt models with medallion architecture and incremental processing.
- Ensure data quality, observability, and incident response.
- Remote-friendly role; occasional travel for stakeholder meetings.
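
As a rough illustration of the dbt work described above, an incremental Silver-layer model reading from a Bronze source might look like the sketch below (model and column names are assumptions, not from the posting):

```sql
-- models/silver/stg_claims.sql (illustrative sketch)
{{ config(
    materialized='incremental',
    unique_key='claim_id',
    incremental_strategy='merge'
) }}

select
    claim_id,
    patient_id,
    icd10_code,
    loaded_at
from {{ ref('bronze_claims') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than those already loaded
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; subsequent runs merge only new rows on `claim_id`, which is the incremental-processing pattern the role calls for.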
🎯 Requirements
- 5+ years data engineering in cloud environments.
- 2+ years Snowflake, Snowpipe, and optimization.
- 2+ years dbt (Core or Cloud) in production, including incremental models.
- Healthcare data experience, including ICD-10, NPI, and NDC code sets.
- Production experience with Airflow, Dagster, or similar.
- Strong SQL: readable, efficient, and debuggable.
- Python for data processing and automation.
- Bachelor’s degree in CS/IS/Engineering; advanced degree a plus.
🎁 Benefits
- Remote-friendly role; EU/India candidates welcome; occasional travel.
- Autonomy in a fast-growing team; impactful data work.