Related skills
Databricks, Apache Spark, Apache Airflow, PySpark, Delta Lake

Description
- Design, develop, and maintain scalable ETL/ELT data pipelines.
- Use PySpark and Apache Spark for large-scale data processing (a minimal sketch follows this list).
- Build and manage workflows using Apache Airflow.
- Develop and optimize Databricks solutions (Jobs, Delta Lake).
- Work with cloud data lakes (S3 or equivalent).
- Write efficient SQL for data transformation and analysis.
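For illustration only, not part of the original posting: a minimal PySpark sketch of the kind of batch job the role describes, reading raw files from a cloud data lake and writing a curated Delta Lake table. The bucket paths, column names, and aggregation are hypothetical placeholders, and Delta support assumes the delta-spark package is available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example: read raw JSON events from a data lake bucket,
# aggregate them per day, and write the result as a Delta table.
spark = (
    SparkSession.builder
    .appName("events-daily-etl")
    # Delta Lake configuration, assuming delta-spark is on the classpath.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical source path in S3 (or an equivalent cloud data lake).
raw = spark.read.json("s3://example-bucket/raw/events/")

daily = (
    raw
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the curated output as a partitioned Delta table.
(daily.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("s3://example-bucket/curated/events_daily/"))
```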
Requirements
- Deep understanding of PySpark and Apache Spark internals.
- Hands-on experience with Databricks (Jobs, Delta Lake).
- Apache Airflow for workflow orchestration (see the sketch after this list).
- Experience building ETL/ELT pipelines at scale.
- Strong SQL and Data Warehouse (DWH) experience.
- Experience running Spark on EMR Serverless or another managed Spark platform.
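Again purely illustrative, not from the posting: a minimal Airflow DAG that could schedule the PySpark job sketched above via spark-submit. The DAG id, script path, and schedule are hypothetical, and the `schedule` argument assumes Airflow 2.4+ (older 2.x releases use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily DAG that submits the PySpark ETL script above.
with DAG(
    dag_id="events_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_spark_job = BashOperator(
        task_id="run_events_daily_etl",
        # Placeholder spark-submit command; the master and script path
        # would depend on the actual cluster (EMR Serverless, Databricks, etc.).
        bash_command="spark-submit --master yarn /opt/jobs/events_daily_etl.py",
    )
```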