Related skills: Java, SQL, Python, Hadoop, Apache Spark
📋 Description
- Identify and prioritize tasks in the software development life cycle.
- Collaborate with business teams to refine software requirements.
- Develop tools and apps with clean, efficient code.
- Automate tasks via tools and scripting.
- Analyze and debug systems.
- Keep software up to date with the latest technologies.
🎯 Requirements
- Experience with distributed computing tools: Hudi, Trino, Hadoop, Spark.
- Experience with distributed storage systems: HDFS, S3.
- Proficiency in Java and Python; experience with RESTful APIs and Git.
- Experience with SQL databases (SQL Server, MySQL) and data modeling.
- Experience with monitoring tools (New Relic, DataDog) and CI/CD.
- BSc/BA in Computer Science or a related field; strong problem-solving skills.
🎁 Benefits
- Benefits starting from Day 1.
- Retirement plan matching.
- Flexible PTO and wellness programs.
- Parental and caregiver leaves.
- Fertility and adoption support.
- Continuous development and EAP programs.