Type: Full time
Related skills: BigQuery, Docker, PostgreSQL, Python, Kubernetes

Description
- Design, build, and maintain DB schemas for transactional and time-series workloads.
- Architect and manage scalable data pipelines from ingestion to storage.
- Build and operate a GCP data warehouse; ensure scalable, well-organized data.
- Develop and maintain APIs for internal services and external consumers.
- Implement real-time and streaming data solutions for high throughput.
- Own data quality, monitoring, and alerting across the data layer.
Requirements
- 5+ years of professional software engineering focusing on DB and data infra
- Deep PostgreSQL expertise: schema design, indexing, performance
- Hands-on time-series DB experience (TimescaleDB/TigerDB)
- Python and/or C++ for pipelines/backend
- Experience building/operating data pipelines and warehouses
- Docker/Kubernetes; GCP data products (BigQuery, Pub/Sub)
Benefits
- Medical, dental, and vision benefits
- Life insurance and disability insurance
- 401(k) with company contribution
- Paid parental leave
- Fertility and infertility benefits
- Industry-competitive PTO and learning opportunities