Added
13 hours ago
Type
Full time
Related skills
aws, postgresql, mongodb, python, kubernetes
Description
- Design and evolve large-scale data platform architecture
- Design and operate data pipelines processing millions of events daily
- Define best practices for data ingestion, transformation, processing, storage
- Mentor team and grow knowledge of big data practices
- Work on Python API services (Django) to serve AI infra
- Build and maintain Airflow DAGs; optimize MongoDB and PostgreSQL
Requirements
- Strong Python skills with APIs for AI/ML teams
- Experience with Airflow, Kubernetes, and AWS
- Experience with ML infra and large language models
- Development experience with Terraform
- Experience working in startup environments
- Willingness to participate in a regular on-call rotation
Benefits
- Remote with hubs in San Francisco, Los Angeles, and NYC
- Medical/dental/vision coverage; 401(k); wellness, commuter, and FSA stipends
- Equity included
- Bi-annual in-person off-sites in unique locations
- Visa sponsorship not available
- Equal opportunity employer with an inclusive culture