Related skills
snowflake · sql · dbt · airflow · fivetran

📋 Description
- Architect, build, and maintain scalable end-to-end data pipelines for BI and internal analytics.
- Develop data flows and apps for ML, analytics, and third-party integrations.
- Collaborate with data scientists and analysts to deploy models and analytical solutions.
- Do hands-on development: write and test code, review peers' changes, and ship to production.
- Help build data infrastructure powering internal reporting to guide decisions.
🎯 Requirements
- Strong SQL skills and Snowflake core features; performance optimization.
- Experience building scalable, tested transformation layers with dbt.
- Experience managing data pipelines with Airflow and ingestion tools (Fivetran or Estuary).
- Comfortable with AI-assisted development tools like Snowflake Cortex, Gemini, or GitHub Copilot.
- Strong Git workflow experience in Agile environments (Jira).
- Bachelor’s degree in Computer Science or related technical field.
🎁 Benefits
- AI features and cutting-edge data tech to drive insights.
- Collaborative, team-first culture and growth opportunities.
- Global team across geographies and roles.
- Inclusive, Equal Opportunity Employer.