Related skills
Terraform, AWS, SQL, Python, DBT

📋 Description
- Design, develop, and maintain scalable data pipelines using Airflow (a minimal example DAG follows this list).
- Create and optimize Athena tables for performance and cost.
- Write and manage SQL transformations in DBT with reusable models.
- Optimize data workflows for performance, reliability, and cost-efficiency.
- Automate infrastructure provisioning with Terraform for data environments.
- Ensure data integrity via monitoring, validation, and error handling.
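As a rough illustration of the kind of pipeline work described above, the sketch below shows a minimal Airflow DAG that lands raw data in S3, runs DBT models, and finishes with a validation step. It is not taken from the employer's codebase: the DAG id, bucket, and dbt project path are invented, and the code assumes Airflow 2.4+ with the dbt CLI available on the worker.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def validate_row_counts():
    """Placeholder data-quality gate; a real pipeline would query Athena
    or rely on dbt tests to verify the transformed tables."""
    print("row-count validation passed")


# DAG id, schedule, S3 bucket, and dbt project path are all hypothetical.
with DAG(
    dag_id="daily_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_to_s3 = BashOperator(
        task_id="extract_to_s3",
        bash_command="echo 'extract source data to s3://example-raw-bucket/'",
    )

    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    validate = PythonOperator(
        task_id="validate_row_counts",
        python_callable=validate_row_counts,
    )

    extract_to_s3 >> run_dbt_models >> validate
```

Keeping extraction, transformation (dbt), and validation as separate tasks is what lets a DAG like this be monitored, retried, and cost-optimized step by step, which is the thrust of the responsibilities above.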
🎯 Requirements
- 1+ years of data engineering experience focusing on ELT.
- Proficient in Python, SQL, data modeling, and performance optimization.
- Experience with Airflow DAGs for workflow orchestration.
- Experience with DBT for data transformations and modular modeling.
- Understanding of AWS S3, Athena, and data lake architectures (see the Athena query sketch after this list).
- Familiarity with Terraform for infrastructure automation (nice to have).
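For the AWS side of these requirements, the sketch below shows a common pattern for querying an S3-backed data lake through Athena from Python: submit the query, poll until it reaches a terminal state, then read the results. It assumes boto3 credentials are already configured; the database, table, and results bucket names are hypothetical.

```python
import time

import boto3

# Hypothetical database and results location for illustration only.
ATHENA_DATABASE = "analytics"
OUTPUT_LOCATION = "s3://example-athena-results/"

athena = boto3.client("athena")


def run_athena_query(sql: str) -> list[dict]:
    """Submit a query to Athena, poll until it finishes, and return the rows."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": ATHENA_DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT_LOCATION},
    )
    query_id = execution["QueryExecutionId"]

    # Poll the execution status until Athena reports a terminal state.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"
        ]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    # The first row of the result set holds the column headers.
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    headers = [col.get("VarCharValue") for col in rows[0]["Data"]]
    return [
        dict(zip(headers, (col.get("VarCharValue") for col in row["Data"])))
        for row in rows[1:]
    ]


if __name__ == "__main__":
    # Hypothetical partitioned table in the data lake.
    print(run_athena_query("SELECT COUNT(*) AS n FROM events WHERE dt = '2024-01-01'"))
```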