This job is no longer available

The job listing you are looking for has expired.

Experienced Data Engineer - Streaming Platform

Added: 20 days ago
Location:
Type: Full time
Salary: Not specified


Founded in 2013, Voodoo is a tech company that creates mobile games and apps with a mission to entertain the world. With 800 employees, 7 billion downloads, and over 200 million active users, Voodoo is the #3 mobile publisher worldwide by downloads, after Google and Meta. Our portfolio includes chart-topping games like Mob Control and Block Jam, alongside popular apps such as BeReal and Wizz.

Team

The Engineering & Data team builds innovative tech products and platforms to support the impressive growth of Voodoo's gaming and consumer apps, keeping the company at the forefront of the mobile industry.

Within the Data team, you’ll join the Ad-Network Team, an autonomous squad of around 30 people. The team is composed of top-tier software engineers, infrastructure engineers, data engineers, mobile engineers, and data scientists (including 3 Kaggle Masters). The team's goal is to enable Voodoo to monetize its inventory directly with advertising partners; it relies on advanced technological solutions to optimize advertising in a real-time bidding environment. It is a strategic topic with significant impact on the business.

This role is based in Paris and requires being onsite 3 days per week.

Role

  • Build, maintain, and optimize real-time data pipelines to process bid requests, impressions, clicks, and user engagement data.
  • Develop scalable solutions using tools like Apache Flink, Spark Structured Streaming, or similar stream processing frameworks.
  • Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines and ensure smooth data flow across systems.
  • Ensure data pipelines deliver high-throughput, low-latency, fault-tolerant processing in real time.
  • Write clean, well-documented code in Java, Scala, or Python for distributed systems.
  • Work with cloud-native messaging and event platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka to ensure reliable message delivery.
  • Assist in the management and evolution of event schemas (Protobuf, Avro), including data consistency and versioning.
  • Implement monitoring, logging, and alerting for streaming workloads to ensure data integrity and system health.
  • Continuously improve data infrastructure for better performance, cost-efficiency, and scalability.
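To give a flavor of the pipeline work described above, here is a minimal sketch of a tumbling-window aggregation over bid-request events in plain Python. This is an illustration only, not Voodoo's actual stack: in practice this logic would run in a framework like Flink or Spark Structured Streaming, and the event fields (`ts`, `campaign`, `bid_price`) and 60-second window are assumptions made for the example.

```python
from collections import defaultdict

# Toy tumbling-window aggregator over bid-request events.
# Field names and the window size are illustrative, not from the posting.
WINDOW_SECONDS = 60

def window_key(ts: float) -> int:
    """Map an event timestamp to the start of its tumbling window."""
    return int(ts // WINDOW_SECONDS) * WINDOW_SECONDS

def aggregate(events):
    """Count bids and sum bid prices per (window, campaign)."""
    stats = defaultdict(lambda: {"bids": 0, "spend": 0.0})
    for e in events:
        key = (window_key(e["ts"]), e["campaign"])
        stats[key]["bids"] += 1
        stats[key]["spend"] += e["bid_price"]
    return dict(stats)

events = [
    {"ts": 10.0, "campaign": "A", "bid_price": 0.5},
    {"ts": 55.0, "campaign": "A", "bid_price": 0.7},
    {"ts": 70.0, "campaign": "A", "bid_price": 0.2},  # falls in the next window
]
result = aggregate(events)
# Window [0, 60) for campaign "A" holds two bids; window [60, 120) holds one.
```

A real streaming job would additionally handle out-of-order events, watermarks, and state checkpointing, which the frameworks named above provide out of the box.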
Profile (Must have)

  • 3-5+ years of experience in data engineering, with a strong focus on real-time streaming systems.
  • Familiarity with stream processing tools like Apache Flink, Spark Structured Streaming, Beam, or similar frameworks.
  • Solid programming experience in Java, Scala, or Python, especially in distributed or event-driven systems.
  • Experience working with event streaming and messaging platforms like GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka.
  • Hands-on knowledge of event schema management, including tools like Avro or Protobuf.
  • Understanding of real-time data pipelines, with experience handling large volumes of event-driven data.
  • Comfortable working in Kubernetes for deploying and managing data processing workloads in cloud environments (AWS, GCP, etc.).
  • Exposure to CI/CD workflows and infrastructure-as-code tools such as Terraform, Docker, and Helm.
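The schema-management requirement above is essentially about evolving event formats without breaking older producers or consumers. The sketch below illustrates the idea behind backward-compatible evolution in the spirit of Avro's default values, using plain JSON for simplicity; the schema, field names, and default are invented for the example.

```python
import json

# Fields added in a newer schema version, with defaults so that events
# written by older producers can still be decoded. Illustrative only.
SCHEMA_V2_DEFAULTS = {"country": "unknown"}

def decode(raw: str) -> dict:
    """Decode a JSON event, filling in v2 fields missing from older events."""
    event = json.loads(raw)
    for field, default in SCHEMA_V2_DEFAULTS.items():
        event.setdefault(field, default)
    return event

old_event = decode('{"user_id": 1}')                    # written by a v1 producer
new_event = decode('{"user_id": 2, "country": "FR"}')   # written by a v2 producer
```

With Avro or Protobuf, this resolution is handled by the serialization library itself (Avro defaults, Protobuf optional fields), typically together with a schema registry that checks compatibility before a new version is published.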
Nice to have

  • Familiarity with real-time analytics platforms (e.g., ClickHouse, Pinot, Druid) for querying large volumes of event data.
  • Exposure to service mesh, auto-scaling, or cost optimization strategies in containerized environments.
  • Contributions to open-source projects related to data engineering or stream processing.
Benefits

  • Competitive salary based on experience
  • Comprehensive relocation package (including visa support)
  • Swile lunch vouchers
  • Gymlib membership (100% covered by Voodoo)
  • Premium healthcare coverage (SideCare) for your family, 100% covered by Voodoo
  • Child day care facilities (Les Petits Chaperons rouges)
  • Wellness activities in our Paris office
  • Unlimited vacation policy
  • Remote days