Related skills
- Terraform, Snowflake, SQL, Python, Databricks

📋 Description
- Design, develop, and manage scalable ETL pipelines with engineering/product teams.
- Create and optimize data models and schemas for Block data (event, customer, and process data).
- Build monitoring for infrastructure health, data quality, and lineage.
- Participate in an on-call rotation: monitor daily jobs, diagnose issues, and fix pipelines to meet SLAs.
- Collaborate with non-technical partners to translate needs into data requirements.
- Standardize business/product metric definitions and develop data dictionaries.
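As a hedged illustration of the last duty, a standardized data dictionary might be sketched like this in Python (all metric names, formulas, and owners are hypothetical, not taken from the posting):

```python
# Hypothetical sketch: a minimal data dictionary that standardizes
# business-metric definitions so every team computes them the same way.
from dataclasses import dataclass

@dataclass(frozen=True)
class MetricDefinition:
    name: str          # canonical metric name
    description: str   # plain-language definition for non-technical partners
    sql: str           # agreed-upon SQL expression (illustrative only)
    owner: str         # team accountable for the definition

# The dictionary itself: one authoritative entry per metric.
DATA_DICTIONARY = {
    "daily_active_customers": MetricDefinition(
        name="daily_active_customers",
        description="Distinct customers with at least one event on a given day.",
        sql="COUNT(DISTINCT customer_id)",
        owner="data-engineering",
    ),
}

def lookup(metric: str) -> MetricDefinition:
    """Return the canonical definition, failing loudly on unknown metrics."""
    if metric not in DATA_DICTIONARY:
        raise KeyError(f"No standardized definition for {metric!r}")
    return DATA_DICTIONARY[metric]
```

Keeping definitions in one owned, queryable structure is what prevents two teams from shipping two different "daily active customers" numbers.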
🎯 Requirements
- 8+ years of experience with a degree, or 6+ years with a Master's (or equivalent experience).
- High proficiency in SQL.
- Experience with Python and Terraform.
- Experience designing large data engineering solutions across the full lifecycle: scope, design, build, test, deploy, and document.
- Experience with ETL scheduling (Airflow or Prefect), schema design, and dimensional modeling.
- Data quality and data lineage monitoring experience.
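The data-quality monitoring mentioned above could be sketched as follows, under assumed semantics (rows as dicts, null rate per column, a configurable threshold; all names and data are illustrative, not from the posting):

```python
# Hypothetical sketch: flag columns whose null rate exceeds a threshold,
# the kind of per-batch check a data-quality monitor might run.
def null_rates(rows):
    """Return {column: fraction of rows where the value is None}."""
    if not rows:
        return {}
    columns = rows[0].keys()
    return {
        col: sum(1 for r in rows if r.get(col) is None) / len(rows)
        for col in columns
    }

def failing_columns(rows, threshold=0.1):
    """Columns whose null rate is above the allowed threshold."""
    return sorted(c for c, rate in null_rates(rows).items() if rate > threshold)

# Illustrative batch: `event` is null in 2 of 3 rows, `customer_id` in 1 of 3.
rows = [
    {"customer_id": 1, "event": "pay"},
    {"customer_id": 2, "event": None},
    {"customer_id": None, "event": None},
]
```

In practice a check like this would run after each pipeline load and page the on-call engineer when a column crosses its threshold.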
🎁 Benefits
- Zone-based pay ranges (Zones A-D) in USD.
- Remote work, medical insurance, flexible time off, retirement savings, and family planning.
- AI-based hiring tools may be used, in compliance with privacy regulations.