Type: Full time
Related skills: terraform, aws, postgresql, python, fastapi

Description
- Architect, implement, and maintain data ingestion and transformation pipelines (Dagster).
- Identify, catalog, and integrate internal and external data sources.
- Operationalize bioinformatics pipelines for large-scale AWS batch processing.
- Normalize and structure heterogeneous data for downstream analysis.
- Collaborate with backend and AI engineers on data-access patterns.
- Write clean, tested Python code and maintain production standards.
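The normalization responsibility above ("normalize and structure heterogeneous data") can be sketched as mapping records from differently shaped sources into one common schema. This is an illustrative sketch only, not the company's actual pipeline: the `Sample` schema and all field names (`sample_id`, `id`, `org`, `read_count`) are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical common schema for normalized records; the real
# schema is not specified in the posting.
@dataclass
class Sample:
    sample_id: str
    organism: str
    read_count: int

def normalize(record: dict) -> Sample:
    """Map a raw record from any source into the common Sample shape.

    The fallback field names ("id" for "sample_id", "org" for
    "organism") are invented for illustration.
    """
    return Sample(
        sample_id=str(record.get("sample_id") or record["id"]),
        organism=record.get("organism") or record.get("org", "unknown"),
        read_count=int(record.get("read_count", 0)),
    )

# Two sources with different field conventions normalize to one shape.
internal = {"sample_id": "S1", "organism": "E. coli", "read_count": 1200}
external = {"id": 42, "org": "S. cerevisiae"}
rows = [normalize(r) for r in (internal, external)]
```

In a real deployment each `normalize` step would typically live inside an orchestrated task (e.g. a Dagster asset, per the posting) so that per-source failures and retries are tracked by the orchestrator rather than ad hoc code.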
Requirements
- BS in CS, Bioinformatics, Computational Biology, or related field; MS preferred.
- 4+ years of production data or software engineering experience.
- Strong Python skills; production-quality code across data/backend contexts.
- Experience with workflow orchestration (Dagster preferred; Airflow/Prefect OK).
- Hands-on AWS experience (S3, ECS, Batch, Lambda) and deploying large-scale pipelines.
- Experience with Postgres/MySQL and/or Neo4j; data modeling and query design.
Benefits
- Competitive benefits including medical, dental, vision, life, and disability insurance.
- Free testing for employees and fertility care benefits.
- Parental leave options including pregnancy and bonding leave.
- 401k retirement plan and commuter benefits.
- Employee referral program.