Related skills: docker, terraform, linux, bash, python

Description
- Be scrappy to find new audio data sources and ingest them.
- Operate and extend ingestion pipeline cloud infra on GCP, managed with Terraform.
- Collaborate with Scientists to improve the cost, throughput, and quality of data at scale.
- Collaborate with AI Team and leadership to craft dataset roadmap.
Requirements
- BS/MS/PhD in Computer Science or a related field.
- 5+ years of industry experience in software development.
- Proficiency with bash/Python scripting in Linux environments.
- Proficiency in Docker and Infrastructure-as-Code concepts, plus professional experience with at least one major cloud provider (we use GCP).
- Experience with web crawlers and large-scale data-processing workflows is a plus.
- Ability to handle multiple tasks and adapt to changing priorities.
Benefits
- A fast-growing environment where you can help shape the company and product.
- An entrepreneurial-minded team that supports risk, intuition, and hustle.
- A hands-off management approach so you can focus and do your best work.
- An opportunity to make a big impact in a transformative industry.
- Competitive salaries and a friendly, asynchronous culture.
- Opportunity to work on a life-changing product that millions of people use.