Related skills
Python, Azure Data Factory, Delta Lake, Azure Databricks, Data Lake Storage

Description
- Design and build modern data solutions, including data lakes.
- Develop robust, scalable data pipelines (batch and streaming).
- Collaborate with Architects, Data Scientists, and DevOps on data quality.
- Create and automate data processes to ensure availability.
- Write clean, well-documented code for data engineering tasks.
- Monitor data quality with automated testing frameworks.
Requirements
- Azure Data Factory and Azure Databricks expertise.
- SQL and Python programming.
- Azure SQL Database and Azure Cosmos DB experience.
- Data Lake Storage; Parquet/Delta Lake formats.
- Hands-on with Azure data platforms and big data tooling.
- Experience designing production-grade ETL/ELT pipelines.
Benefits
- Flexible, distributed work model.
- Medical, dental, and vision benefits.
- 8 weeks parental leave and backup care days.
- TechEleX devices with 3-year refresh.
- Internet reimbursement and wellness programs.