Related skills
java · etl · ci/cd · data modeling · elasticsearch

Description
- Implement data processing pipelines using Kafka for real-time streaming.
- Optimize search capabilities with Elastic technologies.
- Collaborate with product managers and data analysts to translate requirements into technical specs.
- Oversee code reviews, ensure best practices in coding and data handling.
- Stay up-to-date with emerging trends and technologies in big data; propose improvements.
- Troubleshoot and resolve issues to minimize downtime and ensure system reliability.
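To give a flavor of the pipeline work described above, here is a minimal, stdlib-only sketch of the transform stage of such a pipeline. The Kafka consumer and Elasticsearch indexing client are deliberately omitted so the example stays self-contained; the record format and field names are illustrative assumptions, not part of the actual role.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of the transform stage of a streaming pipeline.
// In practice, raw events would arrive from a Kafka topic and the
// resulting documents would be indexed into Elasticsearch; both
// clients are omitted here to keep the example runnable as-is.
public class TransformStage {

    // Parse a raw comma-separated event into an index-ready document.
    // The three-field layout (userId, action, timestamp) is assumed.
    static Map<String, String> toDocument(String rawEvent) {
        String[] parts = rawEvent.split(",");
        return Map.of(
            "userId", parts[0].trim(),
            "action", parts[1].trim().toLowerCase(),
            "timestamp", parts[2].trim()
        );
    }

    public static void main(String[] args) {
        // A small batch standing in for records polled from Kafka.
        List<String> batch = List.of(
            "42, CLICK, 2024-01-01T00:00:00Z",
            "43, VIEW, 2024-01-01T00:00:01Z"
        );
        List<Map<String, String>> docs = batch.stream()
            .map(TransformStage::toDocument)
            .collect(Collectors.toList());
        docs.forEach(System.out::println);
    }
}
```

In a real deployment this transform would sit between a `KafkaConsumer.poll()` loop and a bulk-indexing call, with error handling and offset management around it.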
Requirements
- Bachelor's in CS/Engineering; Master's preferred.
- 8+ years in software engineering.
- 5+ years in big data, with a focus on Elastic and Kafka.
- Proficiency in Java and related frameworks.
- Strong data modeling, ETL, and data warehousing.
- Excellent problem-solving and analytical skills.
- Solid understanding of CI/CD principles.
- Experience with external and in-house APIs and SDKs.
Benefits
- Equity participation - share in our success.
- Flexible work arrangements including remote days.
- Knowledge training and career development tracks.

Nice to have
- Docker and Kubernetes experience.
- Experience with cloud platforms such as AWS or Azure.
- Node.js experience.