Type: Full time

Related skills

AWS, Python, Databricks, CI/CD, Airflow

πŸ“‹ Description

  • Architect and build distributed data processing systems with Python and Spark on AWS
  • Design end-to-end ETL/ELT workflows across diverse data sources
  • Lead the design of a Medallion Architecture (Bronze/Silver/Gold layers) for scalability
  • Build reusable libraries for data quality, metadata management, and pipeline monitoring
  • Build CI/CD processes to automate deployment and testing
  • Enforce data governance, security, privacy, and regulatory compliance
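The Medallion layering mentioned above (raw Bronze data cleaned into Silver, then aggregated into Gold) can be sketched in a few lines. This is a toy illustration in plain Python, standing in for what would be Spark DataFrames in practice; the record shapes and field names are hypothetical.

```python
# Toy Medallion (Bronze/Silver/Gold) layering sketch.
# Bronze: raw ingested records, kept as-is, including bad data.
bronze = [
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "bad"},   # malformed record
    {"user": "a", "amount": "4.5"},
]

def to_silver(rows):
    """Silver: clean and type the raw rows, dropping records that fail validation."""
    silver = []
    for row in rows:
        try:
            silver.append({"user": row["user"], "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip (or quarantine) invalid records
    return silver

def to_gold(rows):
    """Gold: aggregate cleaned rows into a business-level summary per user."""
    totals = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'a': 15.0}
```

In a real pipeline each layer would be persisted (e.g. as Delta tables) so downstream consumers can read from the layer matching their needs.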

🎯 Requirements

  • 4+ years of data engineering experience building production-grade platforms
  • Expert-level Python and Apache Spark, including JVM tuning and memory management
  • Deep expertise in modern data architecture, design patterns, and scalable data modeling
  • Proven experience with AWS (primary) or GCP, using EMR or Databricks
  • Experience with Airflow, AWS Step Functions, or Prefect for orchestrating data lifecycles
  • Familiarity with data governance, security, and lifecycle management policies
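The orchestration tools listed above (Airflow, Step Functions, Prefect) all share one core idea: pipeline tasks are declared as a DAG and executed in dependency order. A minimal stand-in using only the standard library, with illustrative task names:

```python
# Toy sketch of DAG-ordered task execution, the core idea behind
# Airflow/Step Functions/Prefect. Task names are illustrative only.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# static_order() yields tasks so every dependency runs before its dependents
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load']
```

Real orchestrators add scheduling, retries, and backfills on top of this ordering, but the declared-dependency model is the same.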

🎁 Benefits

  • Medical, dental, vision, life insurance
  • Flexible PTO and company holidays
  • Retirement programs
  • 1% charitable giving program
