This job is no longer available

The job listing you are looking for has expired.
Please browse our latest remote jobs.

Added: 1 day ago
Location:
Type: Full time
Salary: Not specified

Our client is a leading institutional investment platform in the digital asset space, providing comprehensive infrastructure and technology for investors to manage their entire digital asset investment lifecycle. Their platform integrates portfolio management, centralized and decentralized trading, risk management, and investment operations, enabling seamless engagement with both permissioned and permissionless decentralized finance (DeFi). Their mission is to build the infrastructure that allows institutional investors to integrate digital assets and DeFi into the existing financial system.

We’re looking for a skilled Data Engineer to design and implement high-throughput, real-time data pipelines that serve both trading systems and analytics infrastructure. You’ll work closely with a small, senior team to build systems that ingest, transform, and serve blockchain data at scale — especially on Solana.

If you're passionate about Rust, real-time systems, stream processing, and the crypto space (or want to be), this role is for you.

Key Responsibilities:

  • Build and maintain low-latency data pipelines using tools like Apache Flink, NATS, Kafka, or similar
  • Ingest and process blockchain data from Solana via Geyser or Yellowstone
  • Work with ClickHouse, RisingWave, or similar systems to support analytics and internal dashboards
  • Develop production-quality code in Rust and/or C++
  • Optimize pipelines for performance, reliability, and scalability
  • Collaborate with quant, infra, and trading teams to ensure systems meet performance requirements
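To give a flavor of the pipeline work described above: blockchain updates often arrive out of order, so a common stage in such pipelines is keyed, last-write-wins state (as a Flink-style keyed operator would maintain). The snippet below is a minimal, std-only Rust sketch of that idea; the `AccountUpdate` shape and its fields are invented for illustration and are far simpler than real Geyser/Yellowstone payloads.

```rust
use std::collections::HashMap;
use std::sync::mpsc;
use std::thread;

// Hypothetical, simplified account-update event. Real Geyser/Yellowstone
// payloads carry many more fields (owner, data, write version, ...).
#[derive(Debug, Clone)]
struct AccountUpdate {
    slot: u64,
    account: String,
    lamports: u64,
}

// One pipeline stage: keep, per account, the update with the highest slot.
// Updates arriving with a lower slot than the stored one are dropped as stale.
fn run_stage(rx: mpsc::Receiver<AccountUpdate>) -> HashMap<String, (u64, u64)> {
    let mut state: HashMap<String, (u64, u64)> = HashMap::new();
    for ev in rx {
        let entry = state.entry(ev.account).or_insert((0, 0));
        if ev.slot >= entry.0 {
            *entry = (ev.slot, ev.lamports);
        }
    }
    state
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let consumer = thread::spawn(move || run_stage(rx));

    // Simulated out-of-order updates for two accounts.
    for (slot, account, lamports) in [
        (100, "alice", 5),
        (102, "bob", 7),
        (101, "alice", 9), // supersedes alice's slot-100 update
        (99, "bob", 1),    // stale: lower slot than bob's slot-102 update
    ] {
        tx.send(AccountUpdate { slot, account: account.to_string(), lamports }).unwrap();
    }
    drop(tx); // closing the channel lets the stage drain and finish

    let state = consumer.join().unwrap();
    assert_eq!(state.get("alice"), Some(&(101, 9)));
    assert_eq!(state.get("bob"), Some(&(102, 7)));
}
```

In a production pipeline the channel would be replaced by a Kafka or NATS consumer and the state by a checkpointed store, but the keyed last-write-wins logic is the same.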
Key Qualifications:

  • Strong experience with stream processing frameworks (e.g., Flink, Kafka, NATS)
  • Hands-on experience with Rust or C++ in production environments
  • Familiarity with data lake and analytics engines like ClickHouse, RisingWave, or Arroyo
  • Understanding of real-time or low-latency system architecture
  • Bonus: Experience with Solana or other blockchain infrastructure
  • Bonus: Familiarity with Geyser plugin system or Solana RPC indexing
Preferred Qualifications:

  • Experience with Rust
  • Experience with blockchain technologies
  • Familiarity with real-time data processing
  • Background in financial or trading systems
What We Offer:

  • Competitive salary and benefits package.
  • Opportunity to work with a passionate and innovative team.
  • Flexible working hours and remote work options.
  • Professional growth and development opportunities.
  • A collaborative and inclusive company culture.
