Related skills
Java, Terraform, AWS, RabbitMQ, Kafka

Description
- Design, develop, and maintain scalable real-time data pipelines.
- Own and run cloud-based infrastructure for reliability and scalability.
- Develop and manage data products with stakeholders.
- Mentor engineers, fostering a collaborative culture.
- Build a deep understanding of a complex business domain spanning multiple teams.
- Own features end-to-end from design to deployment.
Requirements
- 6+ years of experience in DevOps, Data Engineering, or Software Engineering.
- Strong proficiency in Java.
- Experience with streaming platforms (Kafka, Kinesis, RabbitMQ).
- Solid understanding of distributed systems and trade-offs.
- Strong written and spoken English.
Nice to have
- IaC tools: Terraform, Pulumi.
- Cloud platforms: AWS, GCP, or Azure.
- Stream processing frameworks: Flink, Kafka Streams, Spark.
- Python experience.