Related skills
Looker, AWS, Python, Tableau, Apache Spark

Description
- Design, build, and maintain scalable data pipelines.
- Ensure data quality and governance for accuracy and compliance.
- ETL development: build and optimize pipelines that ingest and deliver data.
- Workflow orchestration: manage workflows with Apache Airflow.
- Leverage Trino/Presto and Apache Spark for data processing.
- Collaborate with engineering, product, and business teams to drive data decisions.
Requirements
- 6+ years in data management, governance, and pipelines.
- 5+ years with Airflow, Iceberg, and AWS (S3/Lambda/Redshift).
- 5+ years building ETL pipelines in Python.
- 5+ years with Trino/Presto and Apache Spark.
- Strong data modeling, warehousing concepts, and schema design.
- Leadership: mentor teams and drive data initiatives.
- Excellent communication with technical and non-technical stakeholders.
- Bonus: Looker or Tableau for visualization.
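As a concrete illustration of the ETL work the role describes, here is a minimal sketch of the extract-transform-load pattern in plain Python. The record fields, function names, and in-memory sink are hypothetical stand-ins; a production pipeline at this level would run under Airflow and write to S3 or Redshift rather than a list.

```python
import json

def extract(raw: str) -> list[dict]:
    """Parse raw JSON-lines input into records."""
    return [json.loads(line) for line in raw.strip().splitlines()]

def transform(records: list[dict]) -> list[dict]:
    """Drop records missing an amount and normalize dollars to cents."""
    return [
        {"id": r["id"], "amount_cents": round(r["amount"] * 100)}
        for r in records
        if r.get("amount") is not None
    ]

def load(records: list[dict], sink: list) -> int:
    """Append records to an in-memory sink (stand-in for a warehouse write)."""
    sink.extend(records)
    return len(records)

raw = '{"id": 1, "amount": 9.99}\n{"id": 2, "amount": null}\n{"id": 3, "amount": 4.5}'
warehouse: list[dict] = []
loaded = load(transform(extract(raw)), warehouse)
```

Keeping each stage a pure function of its input makes the pipeline easy to unit-test and to map onto Airflow tasks later.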