Related skills
java postgresql mongodb python scala
Description
- Influence decisions on scalable data workflows architecture
- Build big data pipelines with Spark and Airflow
- Improve performance, throughput, and latency across systems
- Enhance test automation and end-to-end production ownership
- Implement data observability for quality and reliability
- Participate in on-call rotation
Requirements
- BS/BA in Computer Science or equivalent experience
- 1-3 years of software development with production-level code
- Proficiency in Python, Java, or Scala; SQL knowledge
- Experience with Postgres and MongoDB
- Experience with AWS or GCP cloud services
- Knowledge of data processing with Spark, Airflow, Hadoop, and Kafka
- Strong algorithms, Unix/Linux skills, and ability to communicate data-driven insights
Benefits
- 25 days paid vacation
- Private medical insurance for you and your family
- FitPass: gyms across Serbia
- Hybrid work schedule: in-office Tue-Thu
- Growth Investment Program
- Tech setup: company laptop provided