Added: 8 days ago
Type: Full time

Related skills

ETL, SQL, Python, Spark, Delta Lake

πŸ“‹ Description

  • Design, develop, and operate large-scale data pipelines
  • Improve and automate internal processes
  • Monitor jobs and pipelines and alert the team
  • Integrate data sources to meet business needs
  • Write robust, maintainable, well-documented code

🎯 Requirements

  • 2-4 years of data engineering and data warehousing experience
  • Strong skills in Python, Parquet, Spark, Azure Databricks, and Delta Lake
  • Experience with Informatica ETL and Sigma analytics environments
  • SQL development: stored procedures, triggers, indexing, and partitioning
  • Knowledge of ETL/ELT and data-warehousing best practices
  • Familiarity with CI/CD tools and the Azure cloud

