Related skills
ETL · Data Warehouse · SQL · Python · Data Modeling

📋 Description
- Design, build, and maintain ETL data pipelines into the data warehouse.
- Develop and optimize data warehouse schemas for analytics.
- Write complex SQL queries; use Python for data processing.
- Implement data quality checks and cleansing routines.
- Collaborate with data analysts, scientists, and engineers to meet their data needs.
- Document data flows and definitions; uphold team standards.
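To give a flavor of the data quality work described above, here is a minimal, purely illustrative sketch in Python (the column names and rules are hypothetical, not taken from any specific pipeline):

```python
def check_quality(rows):
    """Return (row_index, issue) pairs for rows failing basic checks."""
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Non-null check: every record needs an id and an amount.
        if row.get("id") is None or row.get("amount") is None:
            issues.append((i, "missing required field"))
            continue
        # Uniqueness check on the primary key.
        if row["id"] in seen_ids:
            issues.append((i, "duplicate id"))
        seen_ids.add(row["id"])
        # Range check: amounts should be non-negative.
        if row["amount"] < 0:
            issues.append((i, "negative amount"))
    return issues

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 5.0},   # duplicate id
    {"id": 2, "amount": -3.0},  # negative amount
    {"id": 3, "amount": None},  # missing amount
]
print(check_quality(rows))
# [(1, 'duplicate id'), (2, 'negative amount'), (3, 'missing required field')]
```

In practice checks like these would typically run as validation steps inside an orchestrated pipeline (e.g. Airflow tasks) rather than as standalone scripts.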
🎯 Requirements
- Bachelor’s or Master’s degree in Computer Science, Math, or Physics.
- At least 3 years in data engineering or backend data development.
- Strong SQL skills, with data modeling and data warehouse experience.
- Proficiency in Python for data processing and automation.
- Familiarity with ETL tools and workflow orchestration (Airflow or similar).
- Experience with data quality checks and large-scale datasets.
🎁 Benefits
- Stock grant opportunities, dependent on role, status, and location.
- Additional perks and benefits based on employment status and country.
- Remote work flexibility, including optional WeWork access.