Related skills
Snowflake, SQL, Python, Databricks, dbt

📋 Description
- Build and maintain batch and real-time data pipelines using Spark, Databricks, and Snowflake.
- Ingest, transform, and deliver structured and semi-structured data across cloud platforms.
- Optimize pipelines for scalability and cost efficiency.
- Use Airflow, Temporal, and dbt to automate scheduling, monitoring, and versioning.
- Enforce data quality, lineage tracking, and schema validation.
- Support HIPAA and GDPR compliance with secure data handling and access controls.
🎯 Requirements
- Bachelor’s degree in CS, Engineering, or related field.
- 2-5 years of experience in data engineering or data platforms.
- Exposure to the healthcare data domain and EHR systems preferred.
- Python and SQL for data workflows.
- Spark/Databricks for distributed processing.
- Airflow, Temporal, and dbt for orchestration.
- Azure Data Factory and Databricks on Azure.
- FHIR/HL7/ICD/SNOMED familiarity.