Type: Full time
Related skills: AWS, SQL, Python, Databricks, Spark

Description
- Design and develop robust data pipelines using Python, Spark, Databricks, SQL, SSIS
- Implement and orchestrate ETL/ELT workflows with Apache Airflow and SSIS
- Build repeatable processes for ingestion and transformation of large healthcare datasets
- Integrate data from diverse sources (AWS, on-prem, vendors) into our data platform
- Work with CSV, XML, Parquet, Delta formats and ensure data quality
- Automate observability and monitoring across pipelines and workloads
Requirements
- Proficiency in SQL and in Python or Scala
- Spark and Databricks experience
- Airflow or similar orchestration tools
- Data cleansing, curation, quality frameworks
- Familiarity with Unity Catalog or comparable metadata management tools
- AWS cloud services and data governance awareness
Benefits
- Comprehensive benefits package including medical, dental, and vision
- Unlimited paid time off
- 401(k) plan with employer contribution