Software Engineer - Data Stack - Pipelines

Added: less than a minute ago
Type: Full time
Salary: Not specified

About PostHog

We equip every developer to build successful products.
We started with open-source product analytics and launched out of Y Combinator's W20 cohort.

We've since shipped more than a dozen products, including a built-in data warehouse, a customer data platform, and PostHog AI, an AI-powered analyst that answers product questions, helps users find useful session recordings, and writes custom SQL queries.

Next on the roadmap are messaging, customer analytics, AI task creation and coding based on customer data, logs, and support analytics.

Our values are not an aspirational poster on the wall. They’ve come from how we really work, day in, day out.

PostHog is open source, product-led, default alive, and well funded.

What you will be doing

As a Software Engineer - Pipelines on the Data Stack team, you’ll build and iterate on our data import system.
Our import workers are built in Python: we pull data from APIs and databases in batches, process it in memory with Apache Arrow, and move it into object storage in open table formats.
You’ll build and maintain our source library, as we’re looking for creative ways to keep it manageable at scale. You’ll also revamp our schema management strategy and build resilient systems (e.g. logging, observability, testing).
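
To make that concrete, here’s a minimal sketch of that kind of import loop, assuming pyarrow and an S3 bucket. fetch_batch(), the bucket name, and the region are illustrative placeholders rather than our actual code, and plain Parquet stands in for a full open table format.

```python
# Minimal sketch of the import flow described above (not PostHog's code):
# pull rows in batches, hold each batch in Apache Arrow memory, and land it
# in object storage. fetch_batch(), the bucket, and the region are assumptions.
import pyarrow as pa
import pyarrow.dataset as ds
from pyarrow import fs


def fetch_batch(cursor: int) -> list[dict]:
    """Hypothetical source call: one page of rows from an API or database."""
    if cursor == 0:
        return [{"id": 1, "event": "pageview"}, {"id": 2, "event": "click"}]
    return []  # no more pages


def import_source(bucket: str = "example-warehouse") -> None:
    s3 = fs.S3FileSystem(region="us-east-1")  # assumed region/credentials
    cursor = 0
    while True:
        rows = fetch_batch(cursor)
        if not rows:
            break
        table = pa.Table.from_pylist(rows)  # the batch lives in Arrow memory
        ds.write_dataset(
            table,
            base_dir=f"{bucket}/events",
            filesystem=s3,
            format="parquet",  # Parquet stands in for an open table format here
            basename_template=f"batch-{cursor}-part-{{i}}.parquet",
            existing_data_behavior="overwrite_or_ignore",
        )
        cursor += len(rows)
```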

You’ll debug stateful data workflows by digging into k8s pod metrics, and schedule jobs using Temporal.io. As you can see, there’s a huge breadth of challenges and opportunities to tackle, and nothing is off-limits.
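
Here’s a hedged sketch of what scheduling one of those imports with the Temporal Python SDK (temporalio) can look like. The workflow, activity, task queue, and source names are illustrative, not our actual definitions.

```python
# Sketch of a Temporal-scheduled import: an activity does one batched
# pull-and-load cycle, a workflow wraps it with timeouts so flaky sources
# can be retried. Names and the task queue are illustrative assumptions.
import asyncio
from datetime import timedelta

from temporalio import activity, workflow
from temporalio.client import Client
from temporalio.worker import Worker


@activity.defn
async def run_import(source_id: str) -> int:
    # Hypothetical stand-in for one batched pull-and-load cycle.
    return 0


@workflow.defn
class ImportWorkflow:
    @workflow.run
    async def run(self, source_id: str) -> int:
        # Timeouts (and Temporal's retries) keep the workflow resilient.
        return await workflow.execute_activity(
            run_import,
            source_id,
            start_to_close_timeout=timedelta(minutes=30),
        )


async def main() -> None:
    client = await Client.connect("localhost:7233")
    async with Worker(
        client,
        task_queue="data-imports",
        workflows=[ImportWorkflow],
        activities=[run_import],
    ):
        rows = await client.execute_workflow(
            ImportWorkflow.run,
            "stripe",
            id="import-stripe",
            task_queue="data-imports",
        )
        print(f"imported {rows} rows")


if __name__ == "__main__":
    asyncio.run(main())
```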

The PostHog Data Stack is both a core product for our users and a foundational platform for our internal teams. Data is a first-class product at PostHog, not an afterthought.

You will have the chance to push the boundaries of what our Data Stack team can do while ensuring we remain stable and production-ready.

You now know what you’ll be doing, but what about what you’ll need to bring along?

You’ll fit right in if:

  • You’re a builder. You bring strong skills in building resilient systems, with experience in Kubernetes, Docker, and S3 at scale. We build in Python; async Python and Temporal.io skills are welcome.

  • You have hands-on experience with batch processing and modern data formats. We use Arrow to stream data (see the short streaming sketch after this list). Experience with Iceberg and/or Delta is welcome; we don’t expect you to have experience with all three (although that would be great).

  • You're more than a connector of things. Building pipelines is more than configuring tools to make them work together; it's about actually building the tooling used in data warehousing pipelines. We need you to have experience building tools, not just using off-the-shelf ones.

  • You bring experience with creating and maintaining data pipelines. You are comfortable debugging stateful, async data workflows by digging into k8s pod metrics.

  • You bring a mix of skills. It’s not just about the Data Pipeline work. You’ll need strong backend skills as we run a complex system.

  • You love getting things done. Engineers at PostHog have an incredible amount of autonomy to decide what to work on, so you’ll need to be proactive and just git it done.

  • You’re ready to do the best work of your career. We have incredible distribution, a big financial cushion and an amazing team. There’s probably no better place to see how far you can go.
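
For the Arrow point above: a minimal sketch of what streaming record batches (rather than materializing a whole table) can look like with pyarrow. The generator, schema, and values are illustrative assumptions, not our actual sources.

```python
# Small sketch of streaming data with Arrow: process one RecordBatch at a time
# so memory stays bounded. The generator, schema, and values are illustrative.
import pyarrow as pa

SCHEMA = pa.schema([("id", pa.int64()), ("event", pa.string())])


def batches():
    """Hypothetical source yielding fixed-size Arrow record batches."""
    for start in range(0, 6, 2):
        yield pa.record_batch(
            [pa.array([start, start + 1]), pa.array(["pageview", "click"])],
            schema=SCHEMA,
        )


reader = pa.RecordBatchReader.from_batches(SCHEMA, batches())
for batch in reader:
    # Each batch is handled and released before the next one is pulled.
    print(batch.num_rows, batch.column(1).to_pylist())
```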

If this sounds like you, we should talk.

We are committed to ensuring a fair and accessible interview process. If you need any accommodations or adjustments, please let us know.

What’s in it for you?

Now that we've told you what you'll be building with us, let's talk about what we'll be building for you.
