Related skills
Terraform, Grafana, RabbitMQ, Elasticsearch, Apache Kafka

📋 Description
- Build and maintain scalable, reliable data infrastructure across cloud and on-prem
- Partner with engineering, analytics, and observability teams to support data flow from ingestion to consumption
- Develop infrastructure as code and automation tools to improve deployment, monitoring, and recovery
- Help troubleshoot performance bottlenecks and reliability issues in large-scale pipelines
- Implement data quality processes aligned with security standards, ensuring accuracy, consistency, and compliance with financial regulations and PCI requirements
- Continuously improve data platform performance via tuning and benchmarking
🎯 Requirements
- 3+ years of experience in data engineering, infrastructure, or DevOps roles
- Strong hands-on experience with message queues: Apache Kafka, Confluent, RabbitMQ
- Elastic Stack experience (Elasticsearch, Logstash, Kibana), including ingest pipelines and indexing
- Time-series databases (Prometheus, InfluxDB) and observability with Grafana
- Cloud platforms (AWS, GCP, Azure) and production data infrastructure knowledge
- Infrastructure as code experience using Terraform or Ansible