Type: Full-time

Related skills

AWS, SQL, Python, Databricks, Spark

📋 Description

  • Design and develop robust data pipelines using Python, Spark, Databricks, SQL, and SSIS
  • Implement and orchestrate ETL/ELT workflows with Apache Airflow and SSIS (see the sketch after this list)
  • Build repeatable processes for ingesting and transforming large healthcare datasets
  • Integrate data from diverse sources (AWS, on-premises systems, vendors) into our data platform
  • Work with CSV, XML, Parquet, and Delta formats and ensure data quality
  • Automate observability and monitoring across pipelines and workloads
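
The sketch below is a minimal illustration of the ETL/ELT orchestration described in this list: an Apache Airflow DAG that runs a single PySpark ingestion task, converting a raw CSV drop to Parquet with a basic quality pass. It assumes Airflow 2.x and PySpark are installed; the DAG name, file paths, and schedule are hypothetical placeholders rather than details of this role.

```python
# Illustrative sketch only: all names, paths, and schedules are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_csv_to_parquet():
    """Read a raw CSV drop, apply a simple quality pass, and write Parquet."""
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("claims_ingest").getOrCreate()
    # Hypothetical source path for a vendor CSV extract
    df = spark.read.option("header", "true").csv("/data/raw/claims/*.csv")
    # Basic data-quality pass: drop exact duplicates and fully empty rows
    cleaned = df.dropDuplicates().na.drop(how="all")
    # Hypothetical curated destination in Parquet
    cleaned.write.mode("overwrite").parquet("/data/curated/claims/")
    spark.stop()


with DAG(
    dag_id="healthcare_ingest_example",  # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_csv_to_parquet",
        python_callable=ingest_csv_to_parquet,
    )
```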

🎯 Requirements

  • Proficiency in SQL and in Python or Scala
  • Experience with Spark and Databricks
  • Experience with Airflow or similar orchestration tools
  • Experience with data cleansing, curation, and data quality frameworks
  • Familiarity with Unity Catalog or other metadata management tools
  • Awareness of AWS cloud services and data governance

🎁 Benefits

  • Comprehensive benefits package including medical, dental, and vision
  • Unlimited paid time off
  • 401(k) plan with employer contribution