Added: 21 days ago
Type: Full time
Salary: Not provided

Related skills: AWS, Snowflake, SQL, Python, dbt

📋 Description

  • Design and maintain batch and streaming pipelines across AWS, Airflow, DBT, and Snowflake (a minimal orchestration sketch follows this list).
  • Develop and optimize data models in Snowflake for quality and performance at scale.
  • Collaborate with Product Analysts and AI teams on segmentation and predictive models.
  • Partner with cross-functional teams to translate requirements into scalable data architecture.
  • Implement end-to-end observability and cost/performance optimization in Snowflake and AWS.
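
For illustration only, not part of the posting: a minimal sketch of the Airflow-to-dbt-to-Snowflake orchestration the responsibilities above describe. It assumes Airflow 2.x with the dbt and SnowSQL CLIs available on the worker; the DAG name, file paths, and dbt project location are hypothetical.

```python
# Illustrative sketch only: stage raw data into Snowflake, then run dbt models.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_orders_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # assumes Airflow 2.x
    catchup=False,
) as dag:
    # Load raw files into a Snowflake staging schema (COPY INTO via SnowSQL here;
    # the Snowflake Airflow provider's operator would work just as well).
    stage_raw_orders = BashOperator(
        task_id="stage_raw_orders",
        bash_command="snowsql -f /opt/pipelines/copy_raw_orders.sql",
    )

    # Transform the staged data with dbt models defined in the project repo.
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics --target prod",
    )

    stage_raw_orders >> run_dbt_models
```

In a real deployment the staging step would more likely use an S3 external stage and the Snowflake provider for Airflow rather than a shell call, but the task dependency shape would stay the same.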

🎯 Requirements

  • 6+ years of experience as a Data Engineer or similar role.
  • Expert SQL skills and hands-on data warehouse experience (Snowflake a plus).
  • Strong ETL/ELT experience with DBT, Airflow, or similar.
  • Proficiency in Python or another programming language for data processing.
  • Strong knowledge of data modeling (dimensional modeling, Data Vault, etc.); see the star-schema sketch after this list.
  • Experience with cloud platforms, preferably AWS.
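
Also purely illustrative, not from the listing: a minimal star-schema sketch, one dimension and one fact table created in Snowflake through the official Python connector, to make the dimensional-modeling requirement concrete. All table, column, warehouse, database, and schema names are hypothetical.

```python
# Illustrative sketch only: create a tiny star schema (dim + fact) in Snowflake.
import os

import snowflake.connector  # pip install snowflake-connector-python

DIM_CUSTOMER = """
create table if not exists dim_customer (
    customer_key  integer identity primary key,   -- surrogate key
    customer_id   varchar not null,               -- natural/business key
    segment       varchar,
    valid_from    timestamp_ntz default current_timestamp()
)
"""

FCT_ORDERS = """
create table if not exists fct_orders (
    order_id      varchar not null,
    customer_key  integer references dim_customer (customer_key),
    order_date    date,
    amount_usd    number(12, 2)
)
"""

# Hypothetical warehouse/database/schema; credentials come from the environment.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    cur.execute(DIM_CUSTOMER)
    cur.execute(FCT_ORDERS)
finally:
    conn.close()
```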