Type: Full time

Related skills

Terraform, AWS, SQL, Python, Databricks

πŸ“‹ Description

  • Architect scalable data platforms for real-time and batch analytics.
  • Own data systems end-to-end: ingestion, streaming, transformation, storage, serving.
  • Design and implement distributed data processing with Spark and Databricks on AWS.
  • Build and optimize pipelines with Airflow and modern orchestration frameworks.
  • Define and enforce CI/CD, IaC, testing, and observability standards.
  • Harden pipelines with monitoring, alerts, SLAs, and recovery mechanisms.

🎯 Requirements

  • 8+ years designing and operating high-volume distributed data systems in production.
  • Deep expertise with cloud data platforms (Databricks preferred) and AWS; tuning/cost optimization.
  • Strong proficiency in Python, SQL, and Spark for large-scale processing.
  • Hands-on dbt experience; awareness of how platform decisions affect data modeling.
  • Experience with Airflow and modern CI/CD practices (GitHub Actions, Terraform).
  • Excellent communication skills across engineering, analytics, product, and executive stakeholders.
  • BS in Computer Science, Engineering, Mathematics, or equivalent experience.

🎁 Benefits

  • Medical, dental, vision, life, and disability insurance (employer-paid in the US; supplemental coverage in Canada).
  • 401(k) with company matching (US) and RRSP with DPSP for Canada.
  • Employee Assistance Program (EAP) for mental wellness.
  • Flexible PTO and 12 company-wide days off.
  • Equipment, tools, and reimbursement for a productive remote environment.
  • Free Life360 Platinum Membership for your preferred Circle.

