Related skills
terraform · aws · snowflake · sql · python

Description
- Collaborate & Strategize: Partner with stakeholders to design end-to-end data architecture.
- Build & Maintain Data Models: Design and maintain Snowflake/Iceberg data models with dbt and SQL.
- Orchestrate & Automate: Build data pipelines and CI/CD workflows with Airflow, Python, and Terraform.
- Champion Data Quality: Implement tests, documentation, and governance to maintain a single source of truth.
- Enable Analytics & Workflows: Own data domains; deliver analytics/data apps in Tableau/Retool.
- Innovate with AI: Integrate AI (Snowflake Cortex AI) to democratize analytics.
Requirements
- 6+ years in Analytics or Data Engineering.
- Deep SQL expertise and dimensional data modeling.
- Snowflake (preferred) and Iceberg experience.
- dbt for data transformation.
- Airflow, Fivetran, and Airbyte for orchestration and ETL.
- AWS, Python, and Terraform for cloud, ingestion, and IaC.
Benefits
- Equal opportunity employer; diverse workforce.
- Candidate Privacy Notices apply.
- Motive Perks & Benefits program.