Related skills
Linux, data pipelines, Python, performance optimization, multi-threading

Description
- Implement data downsampling while preserving business value
- Develop features to simplify data tracking for regulatory needs
- Optimize framework performance to run efficiently on vehicle
- Coordinate with cross-functional teams to add missing framework features
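The downsampling responsibility above could take many forms; as a minimal illustrative sketch (the record shape, `is_critical` predicate, and sampling ratio are assumptions, not from the posting), one common approach keeps all business-critical events and randomly samples the rest:

```python
import random

def downsample(records, keep_ratio=0.01,
               is_critical=lambda r: r.get("event") == "anomaly"):
    """Keep every critical record; sample the rest at keep_ratio.

    This preserves rare, high-value events (the 'business value')
    while shrinking the bulk of routine telemetry.
    """
    sampled = []
    for r in records:
        if is_critical(r) or random.random() < keep_ratio:
            sampled.append(r)
    return sampled
```

A real pipeline would likely use stratified or time-bucketed sampling rather than uniform random sampling, but the keep-critical/sample-rest split is the core idea.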
Requirements
- 7+ years of professional software experience and BS in CS or related field
- Strong proficiency in C++ and Python
- Experience with scalable data pipeline architecture
- Experience with Linux and systems programming
- AWS S3 object storage experience across storage tiers
- Petabyte-level data management
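The S3 storage-tier requirement above typically involves moving objects to cheaper classes as they age. A minimal sketch of the tiering policy (the age thresholds and class choices are illustrative assumptions; in practice this is often done with S3 lifecycle rules or `boto3`'s `copy_object` with a `StorageClass` argument):

```python
def choose_storage_class(age_days):
    """Map object age to an S3 storage class.

    Thresholds are illustrative: hot data stays in STANDARD,
    warm data moves to STANDARD_IA, cold data to DEEP_ARCHIVE.
    """
    if age_days < 30:
        return "STANDARD"
    if age_days < 90:
        return "STANDARD_IA"
    return "DEEP_ARCHIVE"
```

At petabyte scale, tier placement dominates storage cost, so a policy like this (or the equivalent declarative S3 lifecycle configuration) is usually the first optimization.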
Benefits
- Paid time off including sick, vacation, bereavement
- Unpaid time off
- Zoox Stock Appreciation Rights
- Amazon RSUs
- Health insurance
- Disability and life insurance