Related skills
Snowflake, ETL, Python, Kubernetes, Databricks
📋 Description
- Deliver solutions built on Apache Airflow and Astronomer
- Interact with customers to ensure successful Airflow implementation
- Architect data pipelines across environments and use cases
- Engage with the Apache Airflow open-source project
🎯 Requirements
- 3-5 years of data engineering with Airflow in production
- Experience creating DAGs and custom Airflow operators/hooks in Python
- Experience building, optimizing, and monitoring ETL/ELT pipelines and data transformations
- Databricks, Snowflake, DBT, and cloud (AWS) experience
- Empathetic, driven, and team-oriented
- Strong prioritization and customer-facing execution
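The ETL/ELT experience listed above can be illustrated with a minimal extract-transform-load sketch. This is purely illustrative: the data, function names, and in-memory "store" are assumptions for the sake of the example, not anything specified by the role.

```python
# Hypothetical minimal ETL sketch; all names and data are illustrative.

def extract():
    # Simulated source records, e.g. rows pulled from an upstream system
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows):
    # Cast string amounts to floats and derive an integer-cents field
    return [
        {**r, "amount": float(r["amount"]),
         "amount_cents": int(float(r["amount"]) * 100)}
        for r in rows
    ]

def load(rows, store):
    # Upsert transformed rows into a keyed store (a dict stands in for a warehouse)
    for r in rows:
        store[r["id"]] = r
    return store

store = load(transform(extract()), {})
```

In an Airflow deployment, each of these steps would typically become a task in a DAG, with Airflow handling scheduling, retries, and monitoring.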
🎁 Benefits
- Hybrid work model: 3 days/week at the Hyderabad, India office
- Opportunity to engage with the Apache Airflow community
- Equal opportunity employer