Senior Data Engineer, Platform & Pipelines

Type: Full time

Related skills

Terraform, AWS, PostgreSQL, Python, FastAPI

πŸ“‹ Description

  • Architect, implement, and maintain data ingestion and transformation pipelines (Dagster).
  • Identify, catalog, and integrate internal and external data sources.
  • Operationalize bioinformatics pipelines for large-scale AWS batch processing.
  • Normalize and structure heterogeneous data for downstream analysis.
  • Collaborate with backend and AI engineers on data-access patterns.
  • Write clean, tested Python code and maintain production standards.
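The normalization responsibility above can be sketched in plain Python. This is a minimal illustration only, assuming records arrive as dicts with inconsistent key casing, stray whitespace, and string-typed numbers; the field names are hypothetical and not taken from the posting.

```python
def normalize_record(record: dict) -> dict:
    """Lowercase keys, strip whitespace, and coerce numeric strings.

    A toy example of normalizing heterogeneous records before
    downstream analysis; real pipelines would add schema validation.
    """
    out = {}
    for key, value in record.items():
        key = key.strip().lower()
        if isinstance(value, str):
            value = value.strip()
            # Coerce simple numeric strings like "1200" or "3.5".
            if value.replace(".", "", 1).isdigit():
                value = float(value) if "." in value else int(value)
        out[key] = value
    return out

# Hypothetical input: same logical fields, inconsistent shapes.
records = [
    {"Sample_ID ": "S1", "Reads": "1200"},
    {"sample_id": "s2", "reads": 980},
]
normalized = [normalize_record(r) for r in records]
```

In an orchestrated setting, a function like this would typically run as a Dagster asset or op between ingestion and storage steps.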

🎯 Requirements

  • BS in CS, Bioinformatics, Computational Biology, or related field; MS preferred.
  • 4+ years of production data or software engineering experience.
  • Strong Python skills; production-quality code across data/backend contexts.
  • Experience with workflow orchestration (Dagster preferred; Airflow/Prefect OK).
  • Hands-on AWS experience (S3, ECS, Batch, Lambda) and deploying large-scale pipelines.
  • Experience with Postgres/MySQL and/or Neo4j; data modeling and query design.

🎁 Benefits

  • Competitive benefits including medical, dental, vision, life, and disability insurance.
  • Free testing for employees and fertility care benefits.
  • Parental leave options including pregnancy and bonding leave.
  • 401k retirement plan and commuter benefits.
  • Employee referral program.
