Related skills
Snowflake, SQL, Python, dbt, Apache Airflow

Description
- Design and maintain scalable data pipelines.
- Collaborate with teams to deliver data-driven solutions.
- Work with Snowflake, dbt, and Airflow for data workflows.
- Handle data migrations and large datasets.
Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, or related quantitative field.
- 3+ years of experience in data engineering, building scalable pipelines.
- Solid experience with data migration projects and large datasets.
- Hands-on Snowflake experience with data loading, querying, and performance optimization.
- Proficiency in dbt for data transformation and modeling.
- Proven experience with Apache Airflow for scheduling and orchestrating data workflows.
Benefits
- A High-Impact Environment
- Commitment to Professional Development
- Flexible and Collaborative Culture
- Global Opportunities
- Vibrant Community
- Total Rewards