Related skills
docker aws kubernetes gcp kafka

Description
- Design, develop, deploy, monitor, operate and maintain existing or new elements of our platform
- Work with technologies such as Apache Kafka, Flink, and Beam to enhance our streaming offerings
- Collaborate with cross-functional teams to integrate streaming solutions into the broader data architecture
- Analyze internal systems and processes to identify improvements and automation opportunities
- Collaborate with product stakeholders to address and prioritize custom edge cases
- Operate a real-time streaming platform with high availability and low downtime for latency-sensitive workloads
Requirements
- 3+ years of experience building and developing large-scale infrastructure, distributed systems, or networks, and/or experience with compute technologies
- Familiarity with streaming systems like Kafka, Flink, Beam, Spark Streaming, or similar technologies
- Experience with Kubernetes and container technologies (e.g., Docker, CRI-O)
- Familiarity with cloud environments such as AWS, GCP, or Azure
- Comfortable working in a fast-paced, dynamic environment
Benefits
- Extended health and dental coverage options, along with life insurance and disability benefits
- Mental health benefits
- Family building benefits
- Access to a Lyft-funded Health Care Savings Account
- RRSP plan to help save for your future
- Hybrid schedule with up to 4 weeks remote per year