Related skills
SQL, Python, dbt, Airflow, Google BigQuery

Description
- Own the technical vision and architecture for the Unified Customer Data Mart.
- Design and implement end-to-end data architectures for Bronze-to-Gold pipelines powering CDP and analytics.
- Define and evolve data modeling standards across customer data domains.
- Architect production-grade data pipelines using dbt and Airflow to support analytics at scale.
- Partner with stakeholders to translate business needs into robust data designs.
- Lead and mentor cross-functional teams, setting standards for quality, reviews, and governance.
Requirements
- Expert in SQL and Python, able to design, optimize, and troubleshoot distributed data systems.
- 5+ years in data engineering/architecture building enterprise-scale platforms.
- Extensive experience designing data lakes, cloud warehouses, and modern analytics architectures.
- Hands-on experience with dbt for transformations and modular data modeling.
- BigQuery (mandatory); experience with Snowflake or Databricks also valued. Airflow or a similar orchestration tool.
- Experience with medallion or layered data architectures and dimensional modeling.
Benefits
- Hybrid flexibility: 4 days per week in our downtown Toronto office.
- 100% health, dental, and vision coverage for you and dependents from day one.
- Build AI systems used by Fortune 500 companies and see real impact.
- Growth & Learning: continuous learning opportunities and influence over technical direction.
- Ownership: shape applied research and AI strategy in a fast-growing data company.