Type: Full time
Salary: Not provided

Related skills

Terraform, Scala, Airflow, Spark, Google Cloud Platform

📋 Description

  • Build scalable data pipelines with Cloud Composer and Airflow
  • Design streaming pipelines using Pub/Sub, Dataflow, Spark on GCP
  • Design, build, and launch scalable data warehouses and pipelines for partners
  • Build data quality checks to meet security and compliance requirements
  • Participate in code and design reviews
  • Mentor junior engineers and improve team processes

🎯 Requirements

  • 5+ years server-side development in Scala/Java/Go/Python
  • 5+ years building large-scale, high-volume services
  • 5+ years managing data pipelines with Dataflow, Spark, Hadoop, Flink, Airflow
  • End-to-end SDLC experience including CI/CD (CircleCI/Jenkins/TravisCI) to production
  • Ability to communicate technical decisions clearly through Technical Design Documents
  • Experience mentoring junior engineers

๐ŸŽ Benefits

  • Medical and Dental Coverage
  • Retirement Plan
  • Commuter Benefits
  • Wellness perks
  • Paid Time Off (Vacation, Sick, Baby Bonding, Cultural Observance)
  • Education Perks
  • Paid Gift Week in December
