Added
19 days ago
Type
Full time
Salary
Salary not provided

Related skills

AWS, SQL, Python, Airflow, Kafka

πŸ“‹ Description

  • Design, develop, and maintain scalable real-time data pipelines.
  • Integrate data sources to enable seamless real-time data flow.
  • Build fault-tolerant, highly available data ingestion processes.
  • Monitor pipeline performance for low latency and high throughput.
  • Collaborate with software engineers, data scientists, and stakeholders on data infrastructure goals.
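The pipeline responsibilities above (ingestion, fault tolerance, seamless flow) can be sketched with a minimal stdlib-only Python example; the function and field names here are illustrative, not taken from the posting:

```python
import json
import time

def parse_events(lines):
    """Parse raw JSON lines into event dicts, skipping malformed records
    rather than failing the whole stream (basic fault tolerance)."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production, malformed records would go to a dead-letter queue

def enrich(events):
    """Attach an ingestion timestamp to each event as it flows through."""
    for event in events:
        event["ingested_at"] = time.time()
        yield event

# Simulated source: one malformed record among valid ones.
raw = ['{"id": 1}', 'not json', '{"id": 2}']
results = list(enrich(parse_events(raw)))
# Two valid events survive; the malformed record is dropped, not fatal.
```

In a real deployment the generator chain would be replaced by a streaming framework (Kafka consumers, Flink operators), but the shape is the same: composable stages that tolerate bad input without halting the stream.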

🎯 Requirements

  • Hands-on experience engineering real-time data pipelines.
  • Distributed streaming frameworks: Kafka, Spark Streaming, or Flink.
  • Cloud platforms: AWS, GCP, or Azure for real-time ingestion and storage.
  • Programming in Python, Java, or Scala.
  • SQL, NoSQL, time-series databases, and data modelling.
  • Airflow and Kubernetes for data pipeline orchestration.
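The SQL and time-series requirement above amounts to queries like windowed aggregates over timestamped rows; a minimal sketch using Python's built-in sqlite3 (the schema and metric names are hypothetical):

```python
import sqlite3

# In-memory time-series table; schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (ts INTEGER, sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?, ?)",
    [(1, "temp", 20.0), (2, "temp", 22.0), (3, "temp", 24.0)],
)

# Typical monitoring query: average value for one sensor over a time window.
avg = conn.execute(
    "SELECT AVG(value) FROM metrics "
    "WHERE sensor = 'temp' AND ts BETWEEN 1 AND 3"
).fetchone()[0]
# avg == 22.0
```

A dedicated time-series database (or partitioned NoSQL store) would add retention policies and downsampling, but the query pattern is the same.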

🎁 Benefits

  • Competitive base salary
  • Company bonus scheme
  • Hybrid working (3 days in office)
  • Private health care (Bupa)
  • Travel insurance
  • Annual company conference