Added: 6 days ago
Location: Not specified
Type: Full time
Salary: Not provided
Related skills: distributed systems, Hadoop, Airflow, Spark, data pipelines
📋 Description
- Design and build a universal data platform for real-time and batch needs.
- Define and implement data pipeline architectures for various scenarios.
- Continuously enhance the data platform for stability, flexibility, and efficiency.
🎯 Requirements
- Bachelor’s degree in CS, Data Science, Mathematics, or related field.
- 5+ years hands-on data development experience.
- Proficient in Hadoop, Spark, Flink, and Airflow, with strong performance-optimization skills.
- Solid knowledge of distributed systems, computation, and storage.
- Excellent communication, logical thinking, self-motivation, and eagerness to learn.
- Experience/knowledge in the accounting domain is a plus.
🎁 Benefits
- Competitive salary
- Annual leave, including birthday and work-anniversary days
- Flexible hybrid or remote work setup
- Internal mobility program offering diverse career opportunities
- Crypto.com Visa card provided upon joining