Added: 3 days ago
Type: Full time
Salary: Not provided

Related skills

ETL, data warehouse, SQL, Python, Airflow

πŸ“‹ Description

  • Design, build, and maintain ETL pipelines that load data into the data warehouse (see the sketch after this list).
  • Develop and optimize data warehouse schemas and tables for analytics.
  • Write SQL queries and use Python to transform and aggregate data.
  • Implement data quality checks and cleansing routines to ensure data integrity.
  • Collaborate with data analysts, data scientists, and engineers to meet data needs.
  • Handle multiple tasks, prioritize work, and communicate progress to stakeholders.
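
To make the day-to-day concrete, here is a minimal sketch of the kind of pipeline described above, assuming Airflow 2.4+ with the TaskFlow API; the DAG name, table contents, and data are illustrative, not taken from the posting.

```python
# Minimal ETL sketch with an inline quality check (illustrative, not the
# employer's actual pipeline). Assumes Airflow 2.4+ and the TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder extract step; a real job would pull from a source
        # system such as an API or an operational database.
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": None}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cleansing routine: drop rows that fail a basic quality check
        # (non-null amount), so only valid records reach the warehouse.
        return [r for r in rows if r["amount"] is not None]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load step; a real job would write to the warehouse,
        # e.g. via a database hook or a bulk COPY into a staging table.
        print(f"Loading {len(rows)} clean rows into the warehouse")

    load(transform(extract()))


orders_etl()
```

In a real deployment the extract and load steps would use hooks or operators for the actual source and warehouse systems, but the shape of the DAG, extract, transform with quality checks, load, stays the same.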

🎯 Requirements

  • At least 3 years of data engineering or backend data development experience.
  • Strong SQL skills and experience with data modeling for data warehouses.
  • Proficiency in Python for data processing and pipelines.
  • Familiarity with ETL tools and workflow orchestration (e.g., Airflow).
  • Experience implementing data quality checks on large-scale datasets (see the sketch after this list).
  • Strong problem-solving, communication, and teamwork for cross-functional work.
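
As a rough illustration of the quality-check requirement, a standalone check script might look like the following; the table name, columns, and connection string are hypothetical, and psycopg2 is assumed as the warehouse driver.

```python
# Sketch of SQL-based data quality checks run from Python (names and
# connection details are hypothetical, not from the posting).
import psycopg2

CHECKS = {
    "no_null_order_ids": "SELECT COUNT(*) FROM fact_orders WHERE order_id IS NULL",
    "no_negative_amounts": "SELECT COUNT(*) FROM fact_orders WHERE amount < 0",
}


def run_checks(dsn: str) -> None:
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for name, query in CHECKS.items():
            cur.execute(query)
            bad_rows = cur.fetchone()[0]
            # Fail loudly so the surrounding pipeline run is marked failed.
            assert bad_rows == 0, f"Quality check {name!r} failed: {bad_rows} bad rows"


if __name__ == "__main__":
    run_checks("dbname=warehouse user=etl host=localhost")
```

Checks like these are typically wired into the orchestrator as a task that runs after each load, so a failed assertion fails the whole pipeline run rather than letting bad data through silently.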

🎁 Benefits

  • Stock grant opportunities, depending on role, employment status, and location.
  • Additional perks and benefits based on employment status and country.
  • Flexible remote work, including optional WeWork access.