Related skills
Azure, ETL, Databricks, Power BI, data pipelines

📋 Description
- Assess legacy data and reporting workloads to identify automation opportunities.
- Migrate Informatica data workloads to Databricks, ensuring performance and data integrity.
- Design and execute an automation-first modernization of data pipelines, reporting, and analytics.
- Apply tool-assisted and AI-assisted techniques to accelerate modernization while ensuring compliance.
- Build repeatable frameworks for data ingestion, transformation, and orchestration.
- Establish governance to ensure modernization efforts are consistent, auditable, and scalable.
- Partner with distributed data teams to deploy patterns across environments.
- Provide hands-on technical leadership while executing.
🎯 Requirements
- 10+ years in data engineering and/or data architecture.
- Proven track record modernizing large-scale legacy data and reporting environments.
- Hands-on experience migrating Informatica data pipelines to Databricks.
- Demonstrated use of automation frameworks, accelerators, or AI-assisted tools to compress delivery timelines.
- Hands-on experience with Azure, Cloudera, and Power BI.
- Strong experience designing and operating ETL/ELT pipelines and analytics layers.
- Deep understanding of data quality, lineage, reconciliation, and validation frameworks.
- Experience working in large-scale or regulated enterprise environments.
- Experience in payer, healthcare, or other regulated domains.
- Experience designing solutions that support multi-environment scalability.
- Exposure to GenAI tooling for code generation and automation.
- Experience as a technical thought leader for new platform initiatives.