Related skills
SQL, Python, Databricks, dbt, Airflow

📋 Description
- Design and optimize data pipelines for structured and unstructured data at scale
- Build and maintain data models (fact, dim, SCD2, marts); see the SCD2 sketch after this list
- Create reports and dashboards to support business performance analysis
- Partner with product, data science, and analytics teams to enable AI features
- Support data architecture transformation and migration on Databricks
- Troubleshoot data quality and performance issues quickly
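For context on the SCD2 modeling mentioned above, here is a minimal sketch of a type 2 dimension update on Databricks using PySpark and Delta Lake's MERGE API. The table names, the tracked column, and the assumption that the staging table holds only new or changed records are hypothetical; this is a simplified illustration, not the team's actual pipeline.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical names: incoming changed/new customer records and the SCD2 dimension table.
updates_df = spark.table("staging.customer_updates")
dim_customer = DeltaTable.forName(spark, "gold.dim_customer")

# Step 1: close out the current row for any customer whose tracked attribute changed.
(dim_customer.alias("d")
    .merge(updates_df.alias("u"),
           "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> u.address",
        set={"is_current": "false", "effective_to": "current_timestamp()"})
    .execute())

# Step 2: append the new current versions (assumes updates_df contains only changed or new rows).
new_rows = (updates_df
    .withColumn("effective_from", F.current_timestamp())
    .withColumn("effective_to", F.lit(None).cast("timestamp"))
    .withColumn("is_current", F.lit(True)))

new_rows.write.format("delta").mode("append").saveAsTable("gold.dim_customer")
```

In practice the change condition would typically compare a hash of all tracked attributes rather than a single column.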
🎯 Requirements
- 5+ years of software/data engineering experience
- Expertise in SQL (window functions, CTEs, complex joins)
- Intermediate–advanced Python (OOP, testing, package management)
- Experience with Databricks (streaming, Unity Catalog, resource management)
- Experience with Spark (partitioning, skew handling, Delta operations, Spark UI); see the skew-handling sketch after this list
- Strong knowledge of data modeling and architecture
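As an illustration of the skew handling listed under the Spark requirement, the sketch below salts a hot join key so that a single heavy key is spread across multiple partitions. The table names and salt factor are hypothetical; on Spark 3.x, adaptive query execution (`spark.sql.adaptive.skewJoin.enabled`) often mitigates join skew without manual salting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical inputs: a large fact table skewed on account_id and a smaller dimension.
events = spark.table("silver.events")
accounts = spark.table("silver.accounts")

SALT_BUCKETS = 16

# Spread hot keys on the large side by appending a random salt bucket...
events_salted = events.withColumn(
    "salt", (F.rand() * SALT_BUCKETS).cast("int"))

# ...and replicate the small side once per salt value so every bucket finds a match.
accounts_salted = accounts.withColumn(
    "salt", F.explode(F.array([F.lit(i) for i in range(SALT_BUCKETS)])))

joined = events_salted.join(accounts_salted, ["account_id", "salt"]).drop("salt")
```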
🎁 Benefits
- Stock options + TFSA/RRSP with 4% company match
- Comprehensive medical, dental, and vision for you and your dependents
- Flex time off + holidays + focused work periods
- Maternity/Parental Leave EI top-up support (after 6 months)
- Work From Anywhere Month + meeting-free weeks yearly
- Life insurance + short/long-term disability coverage