Related skills
docker terraform linux bash python

Description
- Be scrappy in sourcing new audio data for the ingestion pipeline
- Operate and extend the ingestion pipeline cloud infra on GCP with Terraform
- Collaborate with scientists to optimize cost, throughput, and data quality
- Work with the AI team and leadership to craft the dataset roadmap for next-gen products
Requirements
- BS/MS/PhD in Computer Science or a related field
- 5+ years of industry software development experience
- Proficiency with bash/Python scripting in Linux environments
- Proficiency in Docker and Infrastructure-as-Code concepts and experience with GCP
- Ability to handle multiple tasks and adapt to changing priorities
- Strong communication skills, both written and verbal
Benefits
- A fast-growing environment where you can help shape the company and product.
- An entrepreneurial-minded team that supports risk, intuition, and hustle.
- A hands-off management approach so you can focus and do your best work.
- An opportunity to make a big impact in a transformative industry.
- Competitive salaries, a friendly and laid-back atmosphere, and a commitment to building a great asynchronous culture.
- Opportunity to work on a life-changing product that millions of people use.