Type: Full time
Related skills: terraform, aws, sql, python, databricks

Description
- Architect scalable data platforms for real-time and batch analytics.
- Own data systems end-to-end: ingestion, streaming, transformation, storage, serving.
- Design and implement distributed data processing with Spark and Databricks on AWS.
- Build and optimize pipelines with Airflow and modern orchestration frameworks.
- Define and enforce CI/CD, IaC, testing, and observability standards.
- Harden pipelines with monitoring, alerts, SLAs, and recovery mechanisms.
Requirements
- 8+ years designing and operating high-volume distributed data systems in production.
- Deep expertise with cloud data platforms (Databricks preferred) and AWS; tuning/cost optimization.
- Strong proficiency in Python, SQL, and Spark for large-scale processing.
- Hands-on dbt experience; awareness of how platform decisions affect data modeling.
- Experience with Airflow and modern CI/CD practices (GitHub Actions, Terraform).
- Excellent communication skills across engineering, analytics, product, and executive stakeholders.
- BS in Computer Science, Engineering, Mathematics, or equivalent experience.
Benefits
- Medical, dental, vision, life, and disability insurance (company-paid in the US; supplemental coverage in Canada).
- 401(k) with company matching (US); RRSP with DPSP matching (Canada).
- Employee Assistance Program (EAP) for mental wellness.
- Flexible PTO and 12 company-wide days off.
- Equipment, tools, and reimbursement for a productive remote environment.
- Free Life360 Platinum Membership for your preferred circle.