Related skills
terraform · scala · airflow · spark · google cloud platform

📝 Description
- Build scalable data pipelines with Cloud Composer and Airflow
- Design streaming pipelines using Pub/Sub, Dataflow, Spark on GCP
- Design, build, and launch scalable data warehouses and pipelines for partners
- Build data quality checks to meet security and compliance requirements
- Participate in code and design reviews
- Mentor junior engineers and improve team processes
🎯 Requirements
- 5+ years server-side development in Scala/Java/Go/Python
- 5+ years building large-scale, high-volume services
- 5+ years managing data pipelines with Dataflow, Spark, Hadoop, Flink, Airflow
- End-to-end SDLC experience, including CI/CD (CircleCI/Jenkins/TravisCI) through to production
- Ability to explain technical content via Technical Design Documents
- Experience mentoring junior engineers
🎁 Benefits
- Medical and Dental Coverage
- Retirement Plan
- Commuter Benefits
- Wellness perks
- Paid Time Off (Vacation, Sick, Baby Bonding, Cultural Observance)
- Education Perks
- Paid Gift Week in December