Related skills
Terraform, Snowflake, SQL, Python, Databricks

Description
- Design, build, and maintain scalable data pipelines (batch and real-time).
- Develop production-grade workflows for healthcare data.
- Work with Spark, Databricks, Airflow/Temporal, and dbt to ingest and manage data.
- Design reusable data models for analytics and AI/ML use cases.
- Ensure platform resiliency with CI/CD, observability, and logging.
- Leverage Infrastructure as Code (Terraform, CloudFormation) to manage cloud resources.
Requirements
- Bachelor's or Master's in Computer Science, Engineering, or related field.
- 8–12 years in software or data engineering with cloud-native data systems.
- Healthcare domain software development experience (payers/providers/health tech).
- Experience with healthcare data types: FHIR, HL7, Claims, EDI, Epic/Clarity.
- Python, SQL, and Apache Spark proficiency.
- Azure cloud-native platform experience; IaC with Terraform/CloudFormation.