Related skills: Terraform, Snowflake, Python, dbt, Airflow

Description
- Architect and optimize cloud-native data warehouse solutions that support real-time decision-making.
- Design and scale dbt models across bronze and silver layers with data contracts and observability.
- Build and maintain robust ETL/ELT pipelines using Fivetran, Snowpipe, and CDC.
- Lead infrastructure-as-code deployments with Pulumi or Terraform, driving CI/CD maturity.
- Elevate data quality with monitoring, anomaly detection, and validation (a sketch follows this list).
- Collaborate across squads to align on data contracts and enable ML use cases.
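As an illustration of the orchestration and validation responsibilities above, here is a minimal sketch of an Airflow DAG that runs a dbt build and then a simple data quality check. The DAG id, dbt selector, and row-count check are all hypothetical, and the sketch assumes Airflow 2.4+ with a dbt project available on the worker; it shows the shape of the work, not the team's actual pipeline.

```python
# Minimal sketch: orchestrate a dbt run, then validate the output.
# All names (dag_id, selector, table) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def check_row_count(**_):
    # Hypothetical validation: fail the task when the target table is empty.
    # In practice this would query Snowflake via an Airflow connection.
    row_count = 1  # placeholder for a real SELECT COUNT(*) result
    if row_count == 0:
        raise ValueError("Data quality check failed: target table is empty")


with DAG(
    dag_id="elt_pipeline",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",               # Airflow 2.4+ argument
    catchup=False,
) as dag:
    run_dbt = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run --select silver",  # assumed dbt selector
    )
    validate = PythonOperator(
        task_id="validate_row_count",
        python_callable=check_row_count,
    )
    run_dbt >> validate
```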
Requirements
- Strong expertise in data engineering, software engineering, platform architecture, or modern data warehousing.
- Deep expertise in Snowflake (Snowpark, Streams, Tasks, Data Sharing) and SQL/dbt (see the Snowpark sketch after this list).
- Proficient in Python for automation, transformation logic, and data platform development.
- Experience implementing observability standards, anomaly detection, and data quality validation.
- Skilled at leading multi-squad initiatives and delivering scalable outcomes.
- Hands-on knowledge of Airflow, Atlan, and infrastructure as code (Pulumi or Terraform).
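To make the Snowflake-plus-Python expectation concrete, here is a minimal Snowpark sketch. The connection parameters and the ORDERS table are placeholders; the snippet only illustrates the session and DataFrame workflow the role assumes familiarity with.

```python
# Minimal Snowpark sketch: connect, filter a table, persist the result.
# Connection values and table names below are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account>",      # placeholder credentials
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Lazily filter a (hypothetical) ORDERS table and write it back to Snowflake.
orders = session.table("ORDERS")
recent = orders.filter(col("ORDER_DATE") >= "2024-01-01")
recent.write.save_as_table("RECENT_ORDERS", mode="overwrite")

session.close()
```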