Type: Full time

Related skills

Jenkins, AWS, Python, Databricks, Airflow

πŸ“‹ Description

  • Architect and build distributed data processing systems using Python and Spark on AWS.
  • Design and implement end-to-end ETL/ELT workflows ingesting data from diverse sources.
  • Lead implementation of a Medallion Architecture (Bronze, Silver, and Gold layers) for scalability.
  • Build reusable libraries for data quality, metadata, and pipeline monitoring.
  • Create CI/CD processes to automate deployment and testing.
  • Enforce data governance, security, privacy, and regulatory compliance.
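For context on the Medallion layering named above, here is a minimal illustrative sketch in plain Python. In practice these stages would run as Spark jobs on AWS; the layer steps, field names, and sample records here are assumptions, not the employer's actual pipeline.

```python
# Illustrative Medallion-style layering (Bronze -> Silver -> Gold)
# on plain Python records. Production pipelines would use Spark
# DataFrames; the field choices below are hypothetical.

def bronze(raw_rows):
    """Bronze: land raw records as-is, tagging their source."""
    return [dict(r, _source="ingest") for r in raw_rows]

def silver(bronze_rows):
    """Silver: clean and validate — drop rows missing an id, normalize names."""
    return [
        {**r, "name": r["name"].strip().title()}
        for r in bronze_rows
        if r.get("id") is not None
    ]

def gold(silver_rows):
    """Gold: aggregate into a business-ready metric (record count per name)."""
    counts = {}
    for r in silver_rows:
        counts[r["name"]] = counts.get(r["name"], 0) + 1
    return counts

raw = [
    {"id": 1, "name": " alice "},
    {"id": None, "name": "bob"},   # dropped at the Silver layer
    {"id": 2, "name": "ALICE"},
]
print(gold(silver(bronze(raw))))  # {'Alice': 2}
```

Each layer only reads the previous layer's output, which is the core contract of the architecture: raw data is preserved in Bronze, quality rules live in Silver, and business aggregates live in Gold.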

🎯 Requirements

  • 4+ years of professional data engineering experience building production-grade data platforms.
  • Expert-level Python and Apache Spark, including JVM tuning, memory management, and workload optimization.
  • Deep expertise in modern data architecture and data modeling for scalability.
  • Proven AWS (primary) or GCP experience with EMR or Databricks.
  • Experience with orchestration tools: Airflow, AWS Step Functions, or Prefect.
  • Must be located in the EST or CST time zone; unrestricted US work authorization required; sponsorship is not available.

🎁 Benefits

  • Medical, dental, vision, and basic life insurance.
  • PTO and company-paid holidays.
  • Retirement programs.
  • 1% charitable giving program.