Related skills
ETL, SQL, Python, Spark, Delta Lake
Description
- Design, develop, and operate large-scale data pipelines
- Improve and automate internal processes
- Monitor jobs and pipelines, alerting the team on failures (see the sketch after this list)
- Integrate data sources to meet business needs
- Write robust, maintainable, well-documented code
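As a rough illustration of the monitoring-and-alerting duty above, here is a minimal PySpark sketch that wraps a pipeline run in failure handling. The paths, the webhook URL, and the alert_team helper are placeholder assumptions for illustration, not details from the posting.

```python
# Minimal sketch of a monitored pipeline run, assuming a Spark/Databricks
# environment with the Delta Lake libraries available. All paths, the
# webhook URL, and alert_team are illustrative placeholders.
import json
import urllib.request

from pyspark.sql import SparkSession, functions as F

WEBHOOK_URL = "https://hooks.example.com/data-eng-alerts"  # placeholder


def alert_team(message: str) -> None:
    """Post a failure notice to a (hypothetical) team webhook."""
    payload = json.dumps({"text": message}).encode("utf-8")
    req = urllib.request.Request(
        WEBHOOK_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req, timeout=10)


def run_pipeline(spark: SparkSession) -> None:
    # Extract: read raw JSON events (placeholder path).
    raw = spark.read.json("/mnt/raw/events/")
    # Transform: drop rows without an event_id, stamp a load date.
    clean = (
        raw.filter(F.col("event_id").isNotNull())
           .withColumn("load_date", F.current_date())
    )
    # Load: append to a Delta table partitioned by load date.
    (clean.write.format("delta")
          .mode("append")
          .partitionBy("load_date")
          .save("/mnt/curated/events/"))


if __name__ == "__main__":
    spark = SparkSession.builder.appName("events-etl").getOrCreate()
    try:
        run_pipeline(spark)
    except Exception as exc:  # alert the team on any stage failure, then re-raise
        alert_team(f"events-etl failed: {exc}")
        raise
```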
Requirements
- 2-4 years of data engineering and data warehousing experience
- Strong skills in Python, Parquet, Spark, Azure Databricks, and Delta Lake
- Experience with Informatica ETL and Sigma analytics environments
- SQL development: procedures, triggers, indexing, partitioning
- ETL/ELT and data-warehousing best practices (a Delta Lake upsert sketch follows this list)
- Familiarity with CI/CD tooling and the Azure cloud
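To make the Delta Lake and ETL/ELT best-practice lines concrete, here is a brief sketch using the delta-spark MERGE API; the table paths and the id/updated_at columns are assumptions for illustration, not requirements from the posting.

```python
# Sketch of an idempotent upsert (MERGE) into a Delta table, a common
# ETL/ELT best practice on Azure Databricks. The table paths and the
# id/updated_at columns are assumptions made for illustration.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("customers-upsert").getOrCreate()

# Staged changes and the curated target table (placeholder paths).
updates = spark.read.format("delta").load("/mnt/staging/customers/")
target = DeltaTable.forPath(spark, "/mnt/curated/customers/")

(
    target.alias("t")
    .merge(updates.alias("s"), "t.id = s.id")
    # Only overwrite when the staged row is newer than the current one.
    .whenMatchedUpdateAll(condition="s.updated_at > t.updated_at")
    .whenNotMatchedInsertAll()
    .execute()
)
```

Because the MERGE keys on id, re-running the job after a partial failure does not duplicate rows, which is the property the best-practice bullet is getting at.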