Related skills
etl · databricks · data pipelines · oracle · delta lake

Description
- Analyze SSIS packages, SQL Server Agent jobs, and cross-database sync.
- Document data flows, schedules, orchestration, and SQL Server/Oracle dependencies.
- Identify bottlenecks and risks; define migration sequencing and complexity.
- Define target Databricks/Delta Lake architecture for the lakehouse modernization.
- Design Databricks-based ETL and job orchestration equivalents.
- Provide migration blueprint from SSIS to Databricks pipelines.
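As a rough illustration of the kind of orchestration equivalent the blueprint would define, an SSIS package scheduled by a SQL Server Agent job often maps to a Databricks Jobs definition. The sketch below uses the Databricks Jobs API 2.1 JSON shape; all names, paths, and the cron expression are placeholders, not details from this posting:

```json
{
  "name": "nightly-etl-example",
  "schedule": {
    "quartz_cron_expression": "0 0 2 * * ?",
    "timezone_id": "UTC",
    "pause_status": "UNPAUSED"
  },
  "tasks": [
    {
      "task_key": "extract_load",
      "notebook_task": { "notebook_path": "/pipelines/extract_load" }
    },
    {
      "task_key": "transform",
      "depends_on": [ { "task_key": "extract_load" } ],
      "notebook_task": { "notebook_path": "/pipelines/transform" }
    }
  ]
}
```

Task-level `depends_on` entries play the role of SSIS precedence constraints, while the cron schedule replaces the Agent job schedule.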
Requirements
- Databricks experience: clusters, jobs, notebooks, Delta Lake.
- Experience modernizing/migrating SSIS, Informatica, or similar ETL.
- Experience building data pipelines against SQL Server and Oracle.
- Data synchronization patterns and incremental loading.
- Ability to analyze large legacy data workflows and define modernization paths.
- Strong communication; collaborate with architecture and engineering teams.
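The incremental-loading requirement above typically means watermark-based upserts: only rows changed since the last run are pulled, then merged into the target by key. In Delta Lake this is a `MERGE INTO`; the sketch below shows the same pattern with plain Python dicts so it is self-contained (the field names `id` and `updated_at` are illustrative assumptions):

```python
def incremental_merge(target, batch, key="id", watermark_field="updated_at"):
    """Upsert rows from `batch` into `target`, keeping the newest version.

    `target` maps key -> row dict; `batch` is a list of row dicts.
    A row replaces the existing one only if its watermark is newer,
    mirroring the matched/not-matched branches of a Delta MERGE.
    """
    for row in batch:
        existing = target.get(row[key])
        if existing is None or row[watermark_field] > existing[watermark_field]:
            target[row[key]] = row
    return target

target = {1: {"id": 1, "name": "a", "updated_at": 1}}
batch = [
    {"id": 1, "name": "a2", "updated_at": 2},  # newer version: update
    {"id": 2, "name": "b", "updated_at": 1},   # unseen key: insert
]
incremental_merge(target, batch)
```

In a real pipeline, the maximum `updated_at` seen in a run would be persisted as the watermark for the next extraction query.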
Benefits
- Short-term engagement with high-impact contribution.
- Collaborative architecture and engineering team environment.