Related skills: AWS, Snowflake, SQL, Python, dbt

📋 Description
- Design and maintain batch and streaming pipelines across AWS, Airflow, dbt, and Snowflake.
- Develop and optimize data models in Snowflake for quality and performance at scale.
- Collaborate with Product Analysts and AI teams on segmentation and predictive models.
- Partner with cross-functional teams to translate requirements into scalable data architecture.
- Implement end-to-end observability and cost/performance optimization in Snowflake and AWS.
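The pipeline-and-observability work above can be sketched in miniature. This is a minimal, illustrative example only, assuming a simple extract-validate-load step with a row-count metric; the names (`run_batch_step`, `user_id`) are hypothetical, and a real implementation would use Airflow operators and Snowflake connectors rather than plain Python.

```python
# Minimal sketch of one batch-pipeline step: validate rows, emit a
# row-count observability metric. All field and function names are
# illustrative, not from any specific pipeline.

def run_batch_step(raw_rows):
    """Drop rows missing a required key; return (clean_rows, metrics)."""
    clean = [r for r in raw_rows if r.get("user_id") is not None]
    metrics = {
        "rows_in": len(raw_rows),
        "rows_out": len(clean),
        "rows_dropped": len(raw_rows) - len(clean),
    }
    return clean, metrics

rows = [{"user_id": 1, "event": "click"},
        {"user_id": None, "event": "view"}]
clean, metrics = run_batch_step(rows)
# metrics["rows_dropped"] == 1
```

In a production setting, metrics like `rows_dropped` would be shipped to a monitoring system so that data-quality regressions surface as alerts rather than silent gaps.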
🎯 Requirements
- 6+ years of experience as a Data Engineer or similar role.
- Expert SQL skills and hands-on data warehouse experience (Snowflake a plus).
- Strong ETL/ELT experience with dbt, Airflow, or similar.
- Proficiency in Python or another programming language for data processing.
- Strong knowledge of data modeling (dimensional modeling, Data Vault, etc.).
- Experience with cloud platforms, preferably AWS.
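The dimensional-modeling requirement can be illustrated with a tiny sketch: assigning surrogate keys to a dimension from source records, which is the core mechanic behind star-schema dimension tables. The function and field names here (`build_dimension`, `country`) are hypothetical; real warehouses generate surrogate keys in SQL or dbt, not in application code.

```python
# Illustrative sketch of surrogate-key assignment for a dimension table.
# Each distinct natural-key value gets a stable integer surrogate key,
# which fact rows would then reference instead of the raw value.

def build_dimension(records, natural_key):
    """Map each distinct natural-key value to a surrogate key."""
    dim, next_key = {}, 1
    for rec in records:
        nk = rec[natural_key]
        if nk not in dim:
            dim[nk] = next_key
            next_key += 1
    return dim

events = [{"country": "US"}, {"country": "DE"}, {"country": "US"}]
country_dim = build_dimension(events, "country")
# country_dim == {"US": 1, "DE": 2}
```

Separating dimensions from facts this way keeps fact tables narrow and lets descriptive attributes change without rewriting historical fact rows.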