Related skills
Python · Azure Data Factory · Data Lake · Microsoft Fabric · Azure SQL

📋 Description
- Build, run, and maintain robust data pipelines with Azure Data Factory in production.
- Integrate data between internal systems and third-party SaaS platforms.
- Develop and optimize SQL data stores with table design, indexing, and tuning.
- Introduce Microsoft Fabric components (pipelines, data agents) to level up the platform.
- Monitor, troubleshoot, and resolve pipeline and data quality issues.
- Maintain cloud data stores for analytics, automation, and AI workloads.
🎯 Requirements
- Bachelor’s degree in computer science or equivalent.
- 3–6 years’ experience in data/analytics engineering or cloud platforms.
- Hands-on experience with Azure Data Factory and Azure data pipelines.
- Strong SQL skills: query optimization, table design, performance tuning.
- Experience integrating data from internal systems and SaaS products.
- Familiarity with Microsoft Fabric (pipelines/notebooks/data agents) is a plus.
🎁 Benefits
- Collaborative teams with a flat structure and accessible leadership.
- Involvement in quarterly product and tech roadmap and strategy sessions.
- Flexible working policy, with offices in North Sydney.
- 26 weeks paid parental leave.
- Purpose-built spaces for collaboration and focused work.
- Wellbeing benefits including time off and end-of-trip facilities.