Related skills
Hadoop, Airflow, Spark, Flink

Description
- Design and build a universal data platform for real-time and batch processing.
- Define data pipelines tailored to business scenarios using diverse infra.
- Continuously enhance data platform for stability, flexibility, and efficiency.
Requirements
- BS/BA in CS, Data Science, Math, or related field; 5+ years in data development.
- Proficient in Hadoop, Spark, Flink, Airflow; strong performance optimization.
- Strong knowledge of distributed systems, compute and storage architectures.
- Excellent communication, logical thinking, self-motivation, and continuous learning.
- Accounting domain experience is a plus.
Benefits
- Competitive salary.
- Annual leave including birthday and work anniversary.
- Flexible work hours with a hybrid or remote setup.
- Internal mobility program offering diverse opportunities.
- Crypto.com visa card provided upon joining.
- Benefits vary by region; details are available from the talent acquisition team.