Data Engineer (Databricks + PySpark)

Added: 11 hours ago
Type: Full time
Salary: Not provided

Related skills

Databricks, Apache Spark, Apache Airflow, PySpark, Delta Lake

📋 Description

  • Design, develop, and maintain scalable ETL/ELT data pipelines.
  • Use PySpark and Apache Spark for large-scale data processing.
  • Build and manage workflows using Apache Airflow.
  • Develop and optimize Databricks solutions (Jobs, Delta Lake); a minimal PySpark-to-Delta sketch follows this list.
  • Work with cloud data lakes (S3 or equivalent).
  • Write efficient SQL for data transformation and analysis.
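To illustrate the kind of pipeline this role covers, below is a minimal PySpark sketch that reads raw files from a cloud data lake, applies a few transformations, and writes the result to a Delta Lake table. The bucket paths, column names, and app name are hypothetical placeholders, and the Delta write assumes a Databricks runtime or an environment with the Delta Lake package configured.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical locations; replace with real data lake paths.
    RAW_PATH = "s3://example-bucket/raw/orders/"
    DELTA_PATH = "s3://example-bucket/curated/orders/"

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw JSON files from the data lake.
    raw = spark.read.json(RAW_PATH)

    # Transform: deduplicate, derive a date column, drop invalid rows.
    orders = (
        raw.dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_ts"))
        .filter(F.col("amount") > 0)
    )

    # Load: write as a Delta table partitioned by date.
    (
        orders.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save(DELTA_PATH)
    )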

🎯 Requirements

  • Solid understanding of PySpark and Apache Spark internals.
  • Hands-on experience with Databricks (Jobs, Delta Lake).
  • Experience using Airflow for workflow orchestration; a sample DAG sketch follows this list.
  • Experience building and running ETL/ELT pipelines at scale.
  • Strong SQL and data warehouse (DWH) experience.
  • Experience with Spark on EMR Serverless or another managed Spark offering.
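For orientation on the Airflow requirement, here is a minimal sketch of a daily DAG that wraps an ETL step; the DAG id, schedule, and callable are hypothetical, and the snippet assumes Airflow 2.4 or later. In practice the task might instead submit a Databricks Job or an EMR Serverless Spark application through the corresponding provider operator.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_orders_etl():
        # Placeholder body; a real task would launch the Spark/Databricks job.
        print("Running orders ETL")

    with DAG(
        dag_id="orders_etl_daily",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="run_orders_etl",
            python_callable=run_orders_etl,
        )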