Related skills
java, aws, sql, python, scala

Description
- Take ownership of data pipelines: improve, scale, and iterate on them.
- Design and implement new data processing pipelines.
- Collect and monitor performance metrics.
- Help discuss and implement security best practices.
Requirements
- Proficiency in at least one language: C++, Python, Scala, or Java.
- Experience with AWS (S3, EC2, IAM, EMR, Glue, Athena, Kinesis) or other clouds.
- Strong SQL skills and understanding of SQL engines.
- Understanding of distributed computing principles.
- Experience with a workflow tool (Airflow, Oozie, or Luigi).
- Strong analytical thinking; ability to justify technical decisions.
- Creative, resourceful, and innovative problem solver.
- Excellent English communication, written and spoken.
- Nice to have: Prior experience in mapping/navigation/automotive.
- Nice to have: Hands-on with data processing platforms (Spark or similar).
Benefits
- Health care and parental leave
- Flexible work arrangements
- Learning and development opportunities
- Diverse, inclusive team culture