Related skills
terraform, aws, python, databricks, ci/cd

Description
- Own the design, build, and scaling of data pipelines on Databricks and Airflow.
- Drive reliability, performance, and observability of data workloads.
- Lead design and code reviews, setting quality standards across the team.
- Make independent technical decisions on pipeline architecture and tooling.
- Own production monitoring, incident response, and runbooks.
- Document pipelines, data models, and platform workflows.
Requirements
- 3+ years building production Airflow DAGs.
- 3+ years of Python in data engineering.
- 3+ years of ETL/ELT on Databricks/Delta Lake.
- Cloud experience in AWS, Azure, or GCP.
- Terraform (IaC) experience.
- Git-based workflows and CI/CD for data/platform.
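The requirements above center on production pipeline code in Python. As a purely illustrative sketch of the kind of work involved (the function, field names, and sample data below are hypothetical, not part of this posting), a defensive transform step in an ETL job might look like:

```python
# Hypothetical ETL transform: clean raw order records before loading.
# Drops rows missing an id and coerces amounts to float, so one bad
# record does not fail the whole batch.

def transform_orders(raw_rows):
    cleaned = []
    for row in raw_rows:
        if not row.get("order_id"):
            continue  # skip malformed rows
        cleaned.append({
            "order_id": row["order_id"],
            "amount": float(row.get("amount", 0)),
        })
    return cleaned

raw = [
    {"order_id": "A1", "amount": "19.99"},
    {"order_id": None, "amount": "5.00"},  # dropped: missing id
    {"order_id": "A2"},                    # amount defaults to 0.0
]
print(transform_orders(raw))
```

In a real Databricks/Airflow deployment this logic would typically run as a Spark job inside a DAG task, with the same skip-and-log pattern applied at DataFrame scale.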
Benefits
- Health plans, PTO and sick leave, and other benefits.
- 401(k) with up to 5% match.
- Commuter benefits and pet insurance.
- Annual bonus and long-term incentive opportunities.
- Benefits may vary by location.