Added: 7 hours ago
Location:
Type: Full time
Salary: Not provided
Related skills: Java, PostgreSQL, SQL, Python, Kubernetes

Description
- Design, develop, and maintain scalable data pipelines for ingestion and processing
- Implement data integration to central data lake/warehouse from multiple sources
- Collaborate with data scientists to translate requirements into technical specs
- Ensure data quality with validation and cleansing processes
- Monitor pipelines, troubleshoot issues, and ensure reliability
- Maintain ETL processes using Spark, NiFi, or similar tools
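The data-quality bullet above (validation and cleansing before loading) can be sketched minimally in plain Python. The field names (`user_id`, `amount`) and the specific checks are illustrative assumptions, not taken from the posting; in practice this logic would live inside a Spark job or an orchestrated pipeline task.

```python
from typing import Any

def validate(record: dict[str, Any]) -> bool:
    """Return True if the record passes basic quality checks.

    Hypothetical rules for illustration: user_id must be an int,
    amount must be a non-negative number.
    """
    return (
        isinstance(record.get("user_id"), int)
        and isinstance(record.get("amount"), (int, float))
        and record["amount"] >= 0
    )

raw = [
    {"user_id": 1, "amount": 9.99},
    {"user_id": "oops", "amount": 5},   # wrong type: rejected
    {"user_id": 2, "amount": -3},       # negative amount: rejected
]

# Cleansing step: keep only records that pass validation.
clean = [r for r in raw if validate(r)]
```

Rejected records would typically be routed to a quarantine table or dead-letter queue rather than silently dropped, so they can be inspected and reprocessed.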
Requirements
- Bachelor's degree in CS/CE or related field, or equivalent
- 3+ years of progressive data engineering experience
- Fluent in Java, Python, Scala; strong SQL and complex queries
- Experience with Airflow or similar orchestration tools
- Experience with Spark, Kafka, Flink for streaming
- Backend experience: Cassandra, MongoDB, Oracle, PostgreSQL; AWS/GCP
- Containerized environments: Kubernetes, Docker
- Experience building scalable microservices: REST, Spring Boot, gRPC