Related skills
aws, python, gcp, airflow, spark
Description
- Develop features for a schedule-based processing framework using Airflow, EMR, and DuckDB.
- Improve stability, performance, and scalability of data ingestion and processing for geofence/robotaxi deployment.
- Collaborate with software engineers, data scientists, data engineers, and TPMs to design robust architectures.
- Partner with Staff and Senior engineers to translate user pain points into solutions and roadmap items.
- Enhance observability by building monitoring and alerting tools to track performance and success.
Requirements
- BS or MS in Computer Science or related field with 4+ years of software engineering experience.
- Strong background in Python for large-scale data processing.
- Familiarity with Spark, Trino, and DuckDB.
- Experience with cloud services such as AWS, GCP, or Azure.
- Strong experience troubleshooting data pipelines and optimizing performance and cost.
Benefits
- Health, long-term care, and life insurance.
- Long-term and short-term disability coverage.
- Amazon RSUs and Zoox stock rights.
- Paid time off (sick leave, vacation, bereavement).
- Sign-on bonus may be offered.