Related skills
Snowflake, Python, Databricks, dbt, Apache Airflow

Description
- Present and deliver solutions built on Apache Airflow and Astronomer.
- Interact with customers to ensure successful Airflow post-sales implementation.
- Design and develop data pipelines across environments and use cases.
- Engage with the Apache Airflow project and open-source community.
- Keep your Airflow and Astronomer expertise current with the latest releases.
- Become a trusted advisor helping customers put their data in motion.
Requirements
- 3-5 years of data-engineering experience, ideally with Apache Airflow in production.
- Experience with Python, building DAGs, and writing Airflow operators/hooks.
- Experience implementing ETL/ELT pipelines, data transformations, and monitoring.
- Experience with Cloudera, Databricks, Snowflake, dbt, AWS, Azure, or GCP.
- Strong verbal and written communication skills with customers across mediums.
- Highly data-driven and customer-focused, with a collaborative mindset.
Benefits
- Equity component and comprehensive benefits package.
- Remote-friendly work environment.
- Opportunities to engage with the Airflow community.