Related skills
bigquery · looker · terraform · sql · python

Description
- Design, build, and maintain batch/real-time ETL/ELT pipelines on GCP.
- Develop and manage BigQuery data warehouse with scalable models.
- Write SQL and Python to transform data into analytics-ready datasets.
- Optimize data infra for performance and cost (partitioning, clustering, views).
- Create data models for BI tools such as Looker.
- Implement automated data quality checks and data governance.
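The partitioning and clustering mentioned above can be sketched in BigQuery DDL; the table and column names here (`analytics.events`, `event_date`, `user_id`) are illustrative, not taken from the posting:

```sql
-- Partition by day on the event date to prune scans,
-- and cluster on common filter columns to reduce cost.
CREATE TABLE IF NOT EXISTS analytics.events (
  event_date DATE,
  user_id    STRING,
  event_name STRING,
  payload    JSON
)
PARTITION BY event_date
CLUSTER BY user_id, event_name
OPTIONS (partition_expiration_days = 365);
```

Queries that filter on `event_date` then only scan the matching partitions, which is the main lever BigQuery offers for both performance and cost control.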
Requirements
- 3-5+ years in a Data Engineering or similar role.
- Strong Python or Java skills for data processing.
- Expert SQL proficiency.
- BigQuery: data modeling, performance, and cost control.
- GCP data services: Dataflow, Pub/Sub, Cloud Storage, Cloud Composer/Airflow.
- ETL/ELT concepts and data warehousing architecture.
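As a minimal illustration of the Python data-processing skills listed above, here is a sketch of an automated data-quality check of the kind the role describes; the threshold, rows, and column names are hypothetical, not from the posting:

```python
from typing import Any


def check_quality(rows: list[dict[str, Any]], required: list[str],
                  max_null_rate: float = 0.05) -> dict[str, bool]:
    """Return a pass/fail flag per required column based on its null rate."""
    results = {}
    total = len(rows)
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        # A column passes when the dataset is non-empty and its null
        # rate stays at or below the threshold.
        results[col] = total > 0 and nulls / total <= max_null_rate
    return results


# Example: user_id is fully populated, email is 50% null.
rows = [
    {"user_id": 1, "email": "a@example.com"},
    {"user_id": 2, "email": None},
]
print(check_quality(rows, ["user_id", "email"]))
```

In a pipeline, a check like this would typically run as a task after each load, failing the run (or raising an alert) when a column drops below its threshold.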
Benefits
- Hybrid flexibility: 3+ days per week in the downtown Toronto office.
- 100% health, dental, and vision premiums for you and dependents.
- Growth & learning resources to level up your skills.