Related skills
Looker, Snowflake, SQL, Python, dbt

Description
- Design, implement, and validate ETL/ELT pipelines for batch and streaming data.
- Maintain Snowflake data warehouse deployments and Denodo virtualization.
- Recommend process improvements to ELT/ETL efficiency.
- Stay current on data technologies, support pilots, and ensure a scalable platform.
- Architect and implement scalable pipelines feeding real-time data warehouses.
- Partner with data stakeholders on language-model requirements and deliver scalable solutions.
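The batch ETL/ELT responsibilities above follow a standard extract-transform-load shape. The sketch below is purely illustrative: it uses Python's built-in sqlite3 as a stand-in for a warehouse such as Snowflake, and the table and column names (`orders`, `order_id`, `customer`, `amount`) are hypothetical.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows from a source system."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types/casing and drop rows missing the key."""
    out = []
    for row in rows:
        if not row.get("order_id"):
            continue  # skip malformed records
        out.append((int(row["order_id"]),
                    row["customer"].strip().lower(),
                    float(row["amount"])))
    return out

def load(conn, records):
    """Load: idempotent upsert into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders (order_id, customer, amount) "
        "VALUES (?, ?, ?)",
        records,
    )
    conn.commit()

def run_pipeline(conn, csv_text):
    load(conn, transform(extract(csv_text)))

# Hypothetical source payload; the second row is malformed and gets dropped.
RAW = "order_id,customer,amount\n1, Alice ,19.99\n,Bob,5.00\n2,Carol,7.50\n"
conn = sqlite3.connect(":memory:")
run_pipeline(conn, RAW)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 2
```

Keeping each stage a pure function makes the pipeline easy to validate in isolation, which matches the "implement and validate" responsibility above.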
Requirements
- 5+ years in ETL/ELT design and development with heterogeneous sources and data warehouses.
- Excellent English communication skills.
- Effective oral and written communication with the BI team and user community.
- Proficient in Python for data engineering tasks and large-scale processing.
- Design event-driven pipelines using messaging/streaming to trigger ETL workflows.
- Experience in data analysis and problem solving.
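The event-driven requirement above usually means a consumer that triggers an ETL step per incoming message. A minimal in-process sketch, using Python's queue module as a stand-in for a broker such as Kafka (the handler and transform are hypothetical):

```python
import json
import queue

SENTINEL = None  # signals the consumer to stop

def handle_event(event, sink):
    """Hypothetical ETL step triggered by one message: transform and store."""
    record = {"id": event["id"], "value": event["value"] * 2}  # trivial transform
    sink.append(record)

def consume(q, sink):
    """Drain messages until the sentinel arrives, running ETL per event."""
    while True:
        msg = q.get()
        if msg is SENTINEL:
            break
        handle_event(json.loads(msg), sink)

broker = queue.Queue()   # stand-in for a message bus / streaming topic
for i in range(3):
    broker.put(json.dumps({"id": i, "value": i + 1}))
broker.put(SENTINEL)

warehouse = []           # stand-in for the warehouse load target
consume(broker, warehouse)
print(len(warehouse))  # 3
```

In production the queue would be a durable broker and the handler idempotent, so replayed messages do not double-load records.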