Related skills: Java, Python, Distributed Systems, Kafka, Apache Beam

Description
- Design & Build: Develop logic to observe streaming metrics via Kafka I/O transforms.
- Work with Modern Tech: Use Apache Beam to create pipelines for massive scale.
- Integrate: Work with the core engineering team to surface observations in the Integration Manager dashboard.
- Test & Validate: Ensure monitoring doesn't disrupt the pipelines while maintaining high throughput.
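As a rough illustration of the kind of logic this role involves (a plain-Python sketch of windowed metric aggregation, not the team's actual Beam/Kafka stack — function and variable names here are illustrative):

```python
from collections import deque

def windowed_throughput(events, window_secs=60):
    """Count records inside a sliding time window over a stream of
    (timestamp, payload) events.

    A toy stand-in for what a Beam pipeline reading from a Kafka source
    and applying a windowed aggregation would do at scale.
    """
    window = deque()  # timestamps currently inside the window
    counts = []
    for ts, _payload in events:
        window.append(ts)
        # Evict timestamps that have fallen out of the window.
        while window and ts - window[0] >= window_secs:
            window.popleft()
        counts.append(len(window))
    return counts

# Example: five records; the last arrives after the window has expired.
events = [(0, "a"), (10, "b"), (20, "c"), (30, "d"), (100, "e")]
print(windowed_throughput(events))  # [1, 2, 3, 4, 1]
```

In a real deployment the windowing and eviction would be handled by the streaming framework; the sketch only shows the shape of the computation.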
Requirements
- Must be actively enrolled in a college degree program
- Must be legally authorized to work in the United States
- Java or Python: Strong proficiency in at least one (both are supported Beam SDK languages)
- Distributed Systems Concepts: Familiarity with how data moves between nodes in a networked system
- The Data Mindset: Understanding of producer/consumer patterns in messaging systems (Kafka experience is a bonus)
- Previous experience with Apache Beam or Google Cloud Dataflow
Benefits
- Intern events throughout the 12 weeks
- Time with Executives: 1:1s and Q&A
- Professional development workshops for early-career engineers
- Travel for onsite orientation in Austin, TX