Related skills
docker, kubernetes, airflow, kafka, spark

Description
- Manage and monitor distributed systems, storage, and data pipelines (HDFS, Kubernetes, Dremio).
- Focus on systems automation and CI/CD for rapid deployment of hardware/software.
- Collaborate with systems/network engineers, traders, and developers to support their queries.
- Stay updated on tech trends; propose and implement innovative solutions.
Requirements
- 5–7 years managing large-scale, multi-petabyte data infrastructure.
- Advanced Linux admin and troubleshooting expertise.
- Deep expertise in Kafka, Spark, Cassandra/Scylla, or HDFS.
- Strong Docker, Kubernetes, and Helm skills.
- Experience with Dremio and Presto for data access.
- Familiarity with Airflow and Prefect for workflows.