Related skills
terraform, snowflake, python, dbt, airflow

Description
- Architect and optimize cloud-native data warehouse solutions for real-time decision-making
- Design and scale dbt models across bronze/silver layers with data contracts and observability
- Build ETL/ELT pipelines with Fivetran, Snowpipe, and CDC for streaming and batch workloads
- Lead IaC deployments with Pulumi or Terraform, advancing CI/CD maturity
- Improve data quality with monitoring, anomaly detection, and validation
- Collaborate across squads on data contracts and shared data marts to enable ML
- Champion data governance with privacy tagging, lineage, and RBAC
- Drive platform improvements through tech debt remediation and modernization
Requirements
- Expertise in data engineering, software engineering, or data warehousing
- Deep Snowflake expertise (Snowpark, Streams, Tasks, Data Sharing) and SQL/dbt
- Proficient in Python for automation and data platform development
- Experience with observability, anomaly detection, and data quality validation
- Skilled at leading multi-squad initiatives that deliver scalable outcomes
- Hands-on with Airflow, Atlan, and IaC (Pulumi or Terraform)
- Strong data modeling and patterns (star schema, bronze/silver/gold)
- Ability to mentor engineers and influence roadmaps