Related skills: ETL, data warehouse, SQL, Python, Airflow

Description
- Design, build, and maintain ETL pipelines that load data into the data warehouse (a minimal sketch follows this list).
- Develop and optimize data warehouse schemas and tables for analytics.
- Write SQL queries and use Python to transform and aggregate data.
- Implement data quality checks and cleansing routines to ensure data integrity.
- Collaborate with data analysts, data scientists, and engineers to meet data needs.
- Handle multiple tasks, prioritize work, and communicate progress to stakeholders.
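To give a concrete sense of the extract-transform-load work described above, here is a minimal, illustrative Airflow DAG using the TaskFlow API. Every name in it (daily_events_etl, warehouse_db, raw_events) is invented for the example and is not part of this role's actual stack.

```python
# A minimal, illustrative Airflow DAG (TaskFlow API, Airflow 2.x).
# All names here (daily_events_etl, warehouse_db, raw_events) are invented.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_events_etl():
    @task
    def extract() -> list[dict]:
        # In a real pipeline this would read from a source system or API.
        return [{"user_id": 1, "amount": 42.0}, {"user_id": None, "amount": 3.5}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Simple cleansing/quality gate: drop rows missing required fields.
        return [r for r in rows if r.get("user_id") is not None]

    @task
    def load(rows: list[dict]) -> None:
        # In a real pipeline this would write to the warehouse via a SQL hook.
        print(f"loading {len(rows)} rows into warehouse_db.raw_events")

    load(transform(extract()))


daily_events_etl()
```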
Requirements
- At least 3 years of data engineering or backend data development experience.
- Strong SQL skills and experience with data modeling for data warehouses.
- Proficiency in Python for data processing and pipelines.
- Familiarity with ETL tools and workflow orchestration (e.g., Airflow).
- Experience implementing data quality checks on large-scale datasets (see the sketch after this list).
- Strong problem-solving, communication, and teamwork for cross-functional work.
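As a rough illustration of the data-quality requirement, the sketch below shows one common pattern: a function that scans a dataset for rule violations and reports them. The column names and the 1% threshold are hypothetical; on genuinely large datasets the same rules would typically run as SQL or a distributed job, but the pattern is identical.

```python
# An illustrative data-quality check; column names and the 1% threshold
# are hypothetical.
import pandas as pd


def check_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable descriptions of any data-quality failures."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        failures.append("negative order amounts")
    null_rate = df["user_id"].isna().mean()
    if null_rate > 0.01:  # tolerate at most 1% missing user_ids
        failures.append(f"user_id null rate {null_rate:.1%} exceeds 1%")
    return failures


# Example usage:
orders = pd.DataFrame(
    {"order_id": [1, 1, 2], "amount": [10.0, -5.0, 3.0], "user_id": [7, None, 8]}
)
print(check_orders(orders))  # reports duplicates, a negative amount, and missing user_ids
```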
Benefits
- Stock grant opportunities, depending on role, employment status, and location.
- Additional perks and benefits based on employment status and country.
- The flexibility of remote work, including optional WeWork access.