Related skills: aws, nosql, python, spark, dataops
📋 Description
- Define strategic roadmaps and outcomes for clients (consultative)
- Act as data engineering SME to define engagements with pre-sales teams
- Perform technical interviews for Architects and Engineers; provide guidance
- Oversee data standards and operating procedures
🎯 Requirements
- At least 4 years of experience in the AWS data landscape
- 10+ years designing and building complex data systems
- Relational DB design, optimization, migration; BI dashboards
- Big data processing with Spark, streaming, and NoSQL
- ML/MLOps with SageMaker; GenAI with Bedrock
- DataOps including IaC, testing, and versioning; CI/CD; Python analytics
🎁 Benefits
- Base Salary CAD 180k-202k per year, commensurate with experience
- 100% remote work
- 100% Premium Coverage for employee and dependents
- Competitive phantom equity
- 4% Pension match
- Unlimited vacation