Type: Full time

Related skills

Java, AWS, Python, Hadoop, Kafka

📋 Description

  • Build and administer data workflows in IMC's modern big data environment.
  • Develop in-house data tools using Python and Java.
  • Build data pipelines with Hadoop, Spark, Kafka, and AWS.
  • Work with trading and research teams to develop data solutions.
  • Improve the performance of financial analytics on the big data stack.
  • Partial telecommuting permitted.

🎯 Requirements

  • Bachelor's degree in computer science, engineering, IT, or a quantitative field.
  • Six months of data engineering experience in a high-throughput big data ecosystem.
  • Experience building, administering, and troubleshooting data workflows with Hadoop, Spark, and SQL.
  • Develop streaming data apps using Kafka.
  • Develop data solutions in Docker or Kubernetes using Java or Python.
  • Unix scripting (Bash, tcsh, zsh, Python).

🎁 Benefits

  • Discretionary bonus and benefits package.
  • Paid leave and insurance.
  • Collaborative, high-performance culture.
  • Global team across US, Europe, Asia Pacific.