Added: 1 day ago
Location:
Type: Full time
Salary: Not provided
Related skills: BigQuery, Snowflake, Python, Databricks, Go

Description
- Ingestion & Pipeline Infrastructure: build scalable ingestion for batch and streaming data sources.
- Data Warehouse & Lakehouse Architecture: own the core analytical data platform (warehouse, lakehouse, or hybrid).
- Transformation & Orchestration Layer: build scalable transformation pipelines (dbt, Spark).
- Data Serving & Access Layer: provide low-latency access for analytical and operational use.
- Observability & Data Quality: run data quality monitoring and freshness checks.
- Data Catalog & Discoverability: maintain metadata systems and data discoverability.
Requirements
- 5+ years building and operating production data platforms.
- Proficiency in Python, Scala, or Go; experience operating highly reliable distributed data systems.
- Data Warehousing & Lakehouse: Snowflake, BigQuery, Databricks, Delta Lake.
- Streaming & Batch Pipelines: Kafka, Flink; Spark, dbt.
- Data Modeling & Transformation: standards, frameworks, testing.
- Observability & Operations: SLOs, cost optimization, incident response.
Benefits
- Hybrid in-office model in Toronto and Montreal (2 days/week).
- Team lunches, game nights, company-wide events.
- Growth culture with data-driven decision making.