Related skills
terraform, aws, snowflake, sql, dbt

📋 Description
- Design and scale the data platform on Snowflake in AWS.
- Define data modeling standards: dimensional, data vault, semantic models.
- Architect unified data foundations across GRC product domains.
- Lead dbt, Airbyte/Fivetran, Airflow, Terraform implementations.
- Collaborate with product and engineering teams to translate requirements and drive strategy.
🎯 Requirements
- Bachelor's degree in Computer Science, Information Systems, Data Science, or related field.
- 8-10 years of experience in data engineering, data architecture, or cloud data platforms.
- Experience designing scalable, multi-domain data models.
- Proficiency with Snowflake, dbt, Terraform, Airbyte, and Fivetran.
- Extensive AWS experience designing and maintaining cloud infrastructure.
- Strong SQL and data modeling skills; experience with ETL/ELT and data integration.