Added: 32 minutes ago
Type: Full time
Salary: Not provided

Related skills

Java, AWS, ETL, SQL, Python

📋 Description

  • Design and build scalable, secure data pipelines (streaming and batch).
  • Deliver production-grade data solutions across cloud and on-premises environments.
  • Collaborate with cross-functional teams on data transformation initiatives.
  • Lead CI/CD, testing, and automation practices.
  • Contribute to architecture discussions and cloud migration strategies.

🎯 Requirements

  • Hands-on experience with Python, Java, or Scala.
  • Proficiency in AWS cloud, Spark, Hadoop, and Airflow.
  • Strong SQL, ETL/ELT, and data modelling skills.
  • Experience building CI/CD pipelines (Jenkins or CircleCI).
  • Knowledge of data security and distributed system design.
  • Bonus: Kafka, Spark Streaming, or Kinesis familiarity.

🎁 Benefits

  • Core Benefits: discretionary bonus, pension, and health, life, and critical illness cover.
  • Mental Health: easy access to CareFirst, Unmind, and Aviva consultations.
  • Family-Friendly: maternity, shared parental leave, and paid leave options.
  • Family Care: 8 backup care sessions for childcare.
  • Holiday Flexibility: 5 weeks annual leave with buy/sell options.
  • Continuous Learning: 40 hours of training yearly and coaching.

