Related skills
snowflake, etl, sql, python, dbt

Description
- Build and maintain the Snowflake environment and cloud data platform.
- Architect scalable data solutions, including data warehouse and data mart design.
- Engineer ETL/ELT pipelines to support analytics and decision-making.
- Collaborate with analysts and stakeholders on data needs and AI tools.
- Strengthen the data foundation with documentation, governance, and best practices.
Requirements
- 3-5 years of data engineering experience in a production environment.
- Strong SQL and Python skills; able to write complex queries and automate data workflows.
- Hands-on Snowflake experience: managing objects, warehouses, roles, and costs.
- Experience building and testing ETL/ELT pipelines with tools such as dbt, Fivetran, and Airflow.
- Dimensional modeling (star/snowflake schemas).
- Collaborative, with the ability to translate data concepts for non-technical audiences.
Benefits
- Talented, collaborative people who love what they do.
- Learning platform for training and tools from day one.
- Surprise meal stipends for work-from-home.
- Work-life harmony: 20 days vacation, floating holidays, wellness allowance.
- Whole Health Package: medical, dental, vision, life, disability insurance, and more.
- Work-from-home stipend to set up your home office.