Related skills
Snowflake, SQL, Python, dbt, Airflow

Description
- Design and build scalable data products for internal analytics using dbt, Snowflake, and Dagster.
- Translate business needs into structured data assets, owning them end to end from design through deployment.
- Contribute to platform tooling for orchestration, testing, access control, and observability.
- Evaluate new platform capabilities (Semantic Views, Cortex, SnowML).
- Support teammates through onboarding, troubleshooting data issues, and documenting patterns.
- Maintain data reliability and quality via monitoring, alerting, and tests.
Requirements
- 3+ years in data/analytics engineering with a modern stack (Snowflake, dbt, Dagster, Airflow).
- SQL fluency: able to write performant, modular transformations at scale.
- Python for data orchestration, transformation, and testing.
- Understanding of modeling trade-offs (e.g., Kimball dimensional modeling vs. One Big Table).
- Able to communicate technical concepts to both engineers and non-technical stakeholders.
- ROI-minded: knows when to automate and how to balance long-term quality against delivery speed.
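As a rough illustration of the "modular transformations" and "tests" requirements above, here is a minimal sketch in plain Python (the actual stack centers on dbt/SQL; all names, schema fields, and data here are hypothetical):

```python
# Hypothetical sketch: a small, modular staging-layer transformation with a
# built-in quality check, analogous to a dbt model plus a `unique` test.
# The `user_id`/`updated_at` schema is illustrative only.

def dedupe_latest(rows, key="user_id", order_by="updated_at"):
    """Keep only the most recent row per key (a common staging-layer step)."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[order_by] > latest[k][order_by]:
            latest[k] = row
    return list(latest.values())

def assert_unique(rows, key="user_id"):
    """Fail fast if the key is not unique -- like a dbt `unique` test."""
    keys = [r[key] for r in rows]
    assert len(keys) == len(set(keys)), f"duplicate {key} values found"
    return rows

# Example input: user 1 appears twice; only the latest row should survive.
raw = [
    {"user_id": 1, "updated_at": "2024-01-01", "plan": "free"},
    {"user_id": 1, "updated_at": "2024-03-01", "plan": "pro"},
    {"user_id": 2, "updated_at": "2024-02-15", "plan": "free"},
]
staged = assert_unique(dedupe_latest(raw))
```

Keeping each step a small, composable function (dedupe, then test) mirrors how dbt encourages layering models with schema tests between them.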
Benefits
- Medical, dental, and vision coverage starting on Day 1
- Equity (ISOs)
- 401(k) program
- Family planning programs + paid parental leave
- Physical fitness and wellness memberships
- Emotional and mental health support programs