Related skills
Snowflake, Python, Databricks, Apache Airflow, ETL/ELT

Description
- Deliver Airflow solutions with Astronomer; engage customers and partners post-sale.
- Act as trusted advisor; design and develop data pipelines across environments.
- Engage with the Apache Airflow project and open-source community.
- Maintain expertise; stay current with Airflow releases and features.
- Educate customers on Airflow best practices; perform refactoring assessments.
- Collaborate to design, prototype, and implement engineering solutions.
Requirements
- 3-5 years of data-engineering experience with Airflow in production.
- Experience creating DAGs and custom Airflow operators/hooks in Python.
- ETL/ELT pipelines experience; data transformations, optimization, monitoring.
- Experience with Databricks, Snowflake, dbt, and cloud platforms (AWS/Azure/Google Cloud).
- Empathetic, driven, and team-oriented.
- Strong prioritization and ability to execute on customer-facing activities.
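The requirements above center on building ETL/ELT pipelines and authoring Airflow DAGs. A minimal extract-transform-load sketch in plain Python (all function names are hypothetical, and a list stands in for a warehouse table); in Airflow each step would typically become its own task:

```python
# Minimal ETL sketch. All names are hypothetical illustrations,
# not part of any specific stack named in this posting.

def extract():
    """Pretend to pull raw order rows from a source system."""
    return [
        {"id": 1, "amount": "19.99", "currency": "usd"},
        {"id": 2, "amount": "5.00", "currency": "eur"},
    ]

def transform(rows):
    """Normalize types and casing before loading."""
    return [
        {"id": r["id"],
         "amount": float(r["amount"]),
         "currency": r["currency"].upper()}
        for r in rows
    ]

def load(rows, target):
    """Append cleaned rows to a destination; returns the row count."""
    target.extend(rows)
    return len(rows)

warehouse = []  # stand-in for a Snowflake/Databricks table
loaded = load(transform(extract()), warehouse)
```

In an Airflow deployment, `extract`, `transform`, and `load` would each be wrapped as a task (for example with the TaskFlow API) so that retries, scheduling, and monitoring apply per step.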
Nice to have
- Production ETL/ELT experience using the Modern Data Stack.
- Experience building and maintaining CI/CD pipelines.
- Experience with short and long-term customer engagements.
- Experience with Docker and Kubernetes.