Added: 10 days ago
Type: Full time
Related skills
BigQuery, AWS, SQL, Python, GCP

Description
- Architect high-scale data infrastructure for pipelines.
- Build robust ELT/ETL workflows for data lakes and warehouses.
- Design cloud architectures on AWS/GCP with security governance.
- Optimize performance and scalability of data systems.
- Collaborate with Analytics/Product/Design to enable data work.
- Own data quality and reliability across datasets.
Requirements
- 3+ years in data engineering building scalable data systems.
- Hands-on AWS and/or GCP cloud expertise.
- Strong Python/Scala/Java and advanced SQL skills.
- Experience with Spark, Hadoop, and Beam, plus streaming platforms such as Kafka, Pub/Sub, or Kinesis.
- Proficient in Airflow or Dataflow and dbt.
- CI/CD, Docker, Kubernetes; Snowflake/BigQuery/Redshift knowledge.
Benefits
- Competitive benefits package.
- Flex time off and career development.
- Hybrid work with in-office days in Winnipeg, Vancouver, or Hamilton.
- Top-tier hardware and tools (Mac/Windows/Linux).