Added: 6 days ago
Type: Full time
Salary: Not provided

Related skills

Looker, Snowflake, SQL, Python, dbt

πŸ“‹ Description

  • Design, implement, and validate ETL/ELT pipelines for batch and streaming data (a minimal sketch follows this list).
  • Maintain Snowflake data warehouse deployments and Denodo data virtualization.
  • Recommend process improvements to increase ETL/ELT efficiency.
  • Stay current on data technologies, support pilot projects, and help ensure a scalable platform.
  • Architect and implement scalable pipelines feeding real-time data warehouses.
  • Partner with data stakeholders to gather language-model requirements and deliver scalable solutions.
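As a rough illustration of the first responsibility, here is a minimal batch ELT sketch in Python, assuming the snowflake-connector-python package. The source URL, connection credentials, and table names (RAW_EVENTS, ANALYTICS.MARTS.EVENTS) are placeholders for illustration, not details from the posting; in practice a dbt model would typically own the in-warehouse transform step.

```python
# Minimal batch ELT sketch: pull records from a source endpoint, land the raw JSON
# in a Snowflake staging table, then transform in-warehouse with SQL.
# The URL, credentials, and table names below are illustrative placeholders.
import json
import urllib.request

import snowflake.connector  # assumes the snowflake-connector-python package


def extract(source_url: str) -> list[dict]:
    """Pull one batch of records from the upstream system."""
    with urllib.request.urlopen(source_url) as resp:
        return json.load(resp)


def load_and_transform(rows: list[dict]) -> None:
    """Land raw JSON as text, then build a typed table from it (the T of ELT)."""
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder credentials
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        cur.execute("CREATE TABLE IF NOT EXISTS RAW_EVENTS (payload STRING)")
        cur.executemany(
            "INSERT INTO RAW_EVENTS (payload) VALUES (%s)",
            [(json.dumps(r),) for r in rows],
        )
        # In-warehouse transform step; a dbt model would normally own this.
        cur.execute(
            "CREATE OR REPLACE TABLE ANALYTICS.MARTS.EVENTS AS "
            "SELECT PARSE_JSON(payload):id::STRING AS id, "
            "       PARSE_JSON(payload):ts::TIMESTAMP_NTZ AS event_ts "
            "FROM RAW_EVENTS"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_and_transform(extract("https://example.com/api/events"))
```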

🎯 Requirements

  • 5+ years in ETL/ELT design and development with heterogeneous sources and data warehouses.
  • Excellent English communication skills.
  • Effective oral and written communication with the BI team and the user community.
  • Proficient in Python for data engineering tasks and large-scale data processing.
  • Ability to design event-driven pipelines that use messaging/streaming systems to trigger ETL workflows (see the sketch after this list).
  • Experience in data analysis and problem solving.
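A rough sketch of the event-driven requirement, assuming arrival events on a Kafka topic consumed with the kafka-python package; the topic name, event schema, and run_etl_for helper are hypothetical stand-ins for whatever messaging system and orchestration the team actually uses.

```python
# Sketch of an event-driven trigger: consume file-arrival events from a Kafka topic
# and kick off the corresponding ETL workflow. Topic, schema, and helper are illustrative.
import json

from kafka import KafkaConsumer  # assumes the kafka-python package


def run_etl_for(file_path: str) -> None:
    """Placeholder for the real workflow (e.g., a dbt run or an orchestrator trigger)."""
    print(f"starting ETL for {file_path}")


consumer = KafkaConsumer(
    "file-arrivals",                      # hypothetical topic carrying arrival events
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="etl-trigger",
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # Trigger the downstream workflow only for events we recognize.
    if event.get("type") == "file_arrived":
        run_etl_for(event["path"])
```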