Related skills
bash, aws, snowflake, sql, python

📋 Description
- Build and maintain robust data pipelines using Snowflake and AWS
- Architect data storage and processing in Hadoop ecosystems
- Lead projects and coordinate with stakeholders
- Orchestrate data workflows with Luigi and automation scripts
- Monitor pipelines and troubleshoot issues in real time
- Collaborate with teams for data-driven solutions
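The orchestration work described above follows the pattern Luigi formalizes: each task declares its upstream dependencies, and the scheduler runs them in order. As a minimal illustrative sketch (in plain Python, not the real `luigi` API; the names `Task` and `run_pipeline` are hypothetical), the idea looks like this:

```python
# Minimal sketch of dependency-ordered task execution, the pattern Luigi
# formalizes with Task.requires()/run(). All names here are illustrative.

class Task:
    def __init__(self, name, requires=()):
        self.name = name
        self.requires = list(requires)  # upstream tasks that must finish first
        self.done = False

    def run(self, log):
        log.append(self.name)  # stand-in for real work (extract, load, ...)
        self.done = True

def run_pipeline(task, log=None):
    """Run a task after all of its upstream dependencies, depth-first."""
    if log is None:
        log = []
    for dep in task.requires:
        if not dep.done:
            run_pipeline(dep, log)
    if not task.done:
        task.run(log)
    return log

# Example DAG: extract -> transform -> load
extract = Task("extract")
transform = Task("transform", requires=[extract])
load = Task("load", requires=[transform])
print(run_pipeline(load))  # ['extract', 'transform', 'load']
```

In real Luigi, the same shape is expressed by subclassing `luigi.Task` and overriding `requires()`, `run()`, and `output()`; the scheduler handles ordering, retries, and idempotency checks.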
🎯 Requirements
- 5-10 years data engineering experience
- Snowflake expertise
- Python and Linux Bash scripting
- Luigi workflow orchestration
- Hadoop ecosystem and SQL databases
- AWS EC2, S3, RDS, EMR experience