Related skills
java, sql, python, scala, hadoop

Description
- Design and develop complex data pipelines.
- Contribute to end-to-end delivery on enterprise-scale Big Data platforms.
- Ensure data reliability and performance.
- Adhere to engineering best practices.
- Collaborate with cross-functional teams.
- Location: Pune/Bangalore.
Requirements
- 5–9 years in Data Engineering on Data Warehouse/Data Lake/Lakehouse environments
- Spark development (Scala/Python/Java) on Hadoop or object storage (see the sketch after this list)
- Strong SQL capabilities (Oracle/Netezza/Hive or similar)
- Experience delivering large-scale data pipelines
- Agile/Scrum delivery experience
- Experience building Apache NiFi pipelines
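To give a concrete flavor of the Spark and SQL work listed above, here is a minimal sketch of a batch pipeline in Scala. The dataset, paths, and column names (events, user_id, event_ts, the s3a bucket) are illustrative assumptions, not details from this posting:

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of a batch pipeline: read raw JSON events from object
// storage, aggregate per user per day, and write partitioned Parquet.
// All paths and column names below are hypothetical.
object DailyEventCounts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DailyEventCounts")
      .getOrCreate()

    // Raw events land as JSON under a hypothetical bucket/prefix.
    val events = spark.read.json("s3a://example-bucket/raw/events/")
    events.createOrReplaceTempView("events")

    // The aggregation expressed as Spark SQL (Hive-compatible syntax).
    val daily = spark.sql(
      """SELECT user_id,
        |       to_date(event_ts) AS event_date,
        |       COUNT(*)          AS event_count
        |FROM events
        |GROUP BY user_id, to_date(event_ts)""".stripMargin)

    // Partition by date so downstream jobs can prune efficiently.
    daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-bucket/curated/daily_event_counts/")

    spark.stop()
  }
}
```

A job like this would typically be packaged with sbt and launched via spark-submit on YARN or Kubernetes; the details vary by team.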