Added: 3 hours ago
Type: Full time
Salary: Not provided

Related skills

AWS, SQL, Python, GCP, dbt

πŸ“‹ Description

  • Design, build, and maintain scalable data pipelines.
  • Implement data quality checks and governance practices.
  • Develop and optimize dbt models for analytics.
  • Troubleshoot data issues in production environments.
  • Ensure data platforms support analytics and AI initiatives.
  • Collaborate with analysts to ensure data accessibility.
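The responsibilities above mention implementing data quality checks in production pipelines. As a rough illustration of that task (not part of this posting, and simpler than what tools like dbt's built-in schema tests provide), a row-level check in Python might look like the following; the function and rule names are hypothetical:

```python
def run_quality_checks(rows):
    """Return a list of (row_index, rule, message) tuples for rows that fail.

    Two illustrative rules: 'id' must be non-null, and 'amount' must be a
    non-negative number. Real pipelines would load rules from configuration.
    """
    failures = []
    for i, row in enumerate(rows):
        # Rule: 'id' must be present and non-null
        if row.get("id") is None:
            failures.append((i, "not_null_id", "missing id"))
        # Rule: 'amount' must be a non-negative number
        amount = row.get("amount")
        if not isinstance(amount, (int, float)) or amount < 0:
            failures.append((i, "non_negative_amount", f"bad amount: {amount!r}"))
    return failures


sample = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2},
]
print(run_quality_checks(sample))
```

In practice, checks like these are usually declared rather than hand-written, e.g. as `not_null` and `accepted_values` tests in a dbt schema file, and the failing rows are routed to alerting rather than printed.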

🎯 Requirements

  • 7+ years in data engineering with modern data stack tools.
  • Advanced SQL and Python for data engineering tasks.
  • Strong data modeling and warehousing fundamentals.
  • Experience with orchestration tools such as Airflow or mage.ai, transformation with dbt, and ELT tools such as Fivetran or Airbyte.
  • Cloud platforms such as GCP, AWS, or Azure.
  • End-to-end experience building and operating production data pipelines.

🎁 Benefits

  • Fully remote work from home.
  • 25 paid vacation days.
  • Health insurance, including travel and dental coverage.
  • €150 meal allowance plus a flexible benefits plan.
  • Home office setup allowance.
  • Spanish language classes.