Related skills
aws kubernetes airflow kafka iceberg
Description
- Design, build, and operate critical data infrastructure platforms (lakehouse, replication, orchestration, and distributed compute)
- Ensure high availability, scalability, and performance of platform services supporting enterprise data workloads
- Contribute to infrastructure architecture decisions aligned with large-scale, modern data platform standards
- Partner closely with engineering, analytics, and product teams to support data platform adoption and migration initiatives
- Drive improvements in developer productivity through automation, tooling, and AI/agentic workflows
- Participate in on-call rotations and lead troubleshooting efforts for complex production issues
Requirements
- 5+ years of experience in infrastructure, platform engineering, or data engineering
- Experience working on highly cohesive engineering teams that collaborate in real time to solve complex challenges
- Strong background in infrastructure architecture within large-scale or enterprise environments
- Hands-on with cloud platforms (AWS) and container orchestration (Kubernetes)
- Experience with lakehouse architectures, data replication, and orchestration (Airflow, Kafka, Trino, Iceberg)
- Demonstrated experience using AI/agentic engineering tools to improve development efficiency
Benefits
- Competitive year-end performance bonus and equity package
- Full medical, dental, vision package to fit your needs
- Flexible vacation policy; work hard and take time when you need it
- Pet discount plans & retirement plan with company match (401K)
- The rare opportunity to work with sharp, motivated teammates on truly unique challenges