Related skills
AWS, SQL, Python, GCP, dbt
Description
- Design, build, and maintain scalable data pipelines.
- Implement data quality checks and governance practices.
- Develop and optimize dbt models for analytics.
- Troubleshoot data issues in production environments.
- Ensure data platforms support analytics and AI initiatives.
- Collaborate with analysts to ensure data accessibility.
Requirements
- 7+ years in data engineering with modern data stack tools.
- Advanced SQL and Python for data engineering tasks.
- Strong data modeling and warehousing fundamentals.
- Experience with orchestration tools such as Airflow or mage.ai, with dbt, and with ELT tools such as Fivetran or Airbyte.
- Cloud platforms such as GCP, AWS, or Azure.
- Experience building end-to-end production data pipelines.
Benefits
- Fully remote work.
- 25 paid vacation days.
- Health insurance, including travel and dental coverage.
- Meal allowance of €150 plus a flexible benefits plan.
- Home office setup allowance.
- Spanish language classes.