Related skills
SQL, Python, dbt, data pipelines, data quality monitoring

📋 Description
- Own end-to-end reliability for core data pipelines—from ingestion through billing and reporting.
- Build observability foundations: health dashboards, tiered alerts, actionable on-call signals.
- Harden ingestion with typed staging, quarantine patterns, freshness gates, and safe backfills.
- Enforce schema contracts and quality validation to catch failures in CI.
- Establish clear boundaries between governed core data and downstream BI/reporting.
- Lead incident retrospectives that fix root causes and reduce on-call burden.
🎯 Requirements
- Bachelor's degree or higher.
- 7+ years in data engineering with ownership of production pipelines.
- Deep dbt experience with real deployments, not prototypes.
- Strong SQL and data modeling instincts with performance and cost awareness.
- Python fluency for tooling, validation and automation.
- Hands-on data quality monitoring: contracts, freshness checks, schema enforcement and alerting.
🎁 Benefits
- Remote, US-based role (#LI-remote).
- Equal opportunity employer committed to diversity; reasonable accommodations available.
- No visa sponsorship; must be eligible to work in the U.S.