Related skills
linux, aws, hadoop, airflow, kafka
📋 Description
- Define the multi-year roadmap for Stripe’s Batch Compute Infrastructure.
- Hire, mentor, and scale a high-performing engineering team.
- Maintain Tier-0 infrastructure for tens of thousands of daily workloads.
- Collaborate with data platform, finance, and product teams on cost metrics.
- Provide technical guidance in architecture reviews for Hadoop/Spark/AWS.
🎯 Requirements
- 10+ years of professional software development experience.
- 3+ years in direct engineering management of high-velocity teams.
- Deep background in large-scale distributed data systems (Hadoop, Spark, Kafka).
- Proven track record of cost optimization and capacity planning.
- Experience in cross-functional, global organizations.
- Preferred: remote team leadership; on-prem Hadoop; AWS S3.