Related skills: Azure, Docker, AWS, PostgreSQL, Python
📋 Description
- Learn Airflow, data engineering, and cloud tech.
- Solve Airflow issues; optimize configuration and fix bugs.
- Work with Kubernetes, Docker, and containers.
- Collaborate with customers’ data engineers, administrators, and DevOps teams.
- Own the customer experience; meet SLAs with guidance.
- Work remotely as part of a distributed team and participate in on-call shifts.
🎯 Requirements
- 5 years of professional experience (any industry).
- 3 years Python experience.
- 1+ year Apache Airflow experience.
- Experience with Kubernetes, Docker, and container technologies.
- Customer support experience; cloud exposure (AWS, GCP, Azure).
- Bonus: SQL & PostgreSQL; Databricks/Snowflake/Redshift/dbt.
🎁 Benefits
- Remote-friendly, fully distributed team.
- Time for side projects and open-source work.
- Learn from leaders of the Apache Airflow project.
- Work on a cloud-native product used by many customers.
- Exposure to product, engineering, and customer relationships.