Related skills
aws · snowflake · python · kubernetes · dbt

📋 Description
- Build and maintain data pipelines and data models
- End-to-end development of data infrastructure and processing
- Operate parts of production data systems
- Collaborate with researchers, architects, and engineers
- Hybrid Tel Aviv office role (2 days/week)
- Work with a modern data stack (Python, Spark, Airflow, dbt)
🎯 Requirements
- 6+ years designing server-side data solutions
- Build/optimize large-scale data pipelines and architectures
- AWS ecosystem experience
- Data warehouse skills (Snowflake/Redshift/Databricks)
- Proficient in Python; Kubernetes in production
- Experience integrating GenAI into data flows; familiarity with Airflow; knowledge of Spark/Hadoop