Related skills
Java, AWS, Python, Scala, Hadoop

Description
- Design and build end-to-end data pipelines on AWS.
- Work on greenfield and enterprise-scale platforms.
- Ingest, transform, and serve data at scale in financial services.
- Collaborate with clients to define solutions and build production systems.
- Apply AWS Well-Architected Principles for scalability and security.
- Lead development of robust, fault-tolerant data solutions.
Requirements
- Proficient in Python, Scala or Java with Spark/Hadoop.
- Real-time streaming: Kafka, Spark Streaming, or Kinesis.
- Proficiency with AWS cloud environments.
- Experience with Data Lakehouse and Data Warehousing architectures.
- CI/CD, DevOps tooling, and data governance, including GDPR.
- Experience delivering solutions for highly regulated industries.
Benefits
- Core benefits: bonus, pension, health and life cover.
- Mental health support: CareFirst, Unmind, Aviva, and in-house resources.
- Family leave: maternity, adoption, shared parental leave.
- 8 backup care sessions for emergency childcare/elder care.
- 5 weeks holiday with buy/sell option.
- 40 hours training annually; day-one coaching.