Added: less than a minute ago
Type: Full time
Salary: not provided

Related skills

Java, PostgreSQL, MongoDB, Python, Scala

📋 Description

  • Influence decisions on scalable data workflows architecture
  • Build big data pipelines with Spark and Airflow
  • Improve performance, throughput, and latency across systems
  • Enhance test automation and end-to-end production ownership
  • Implement data observability for quality and reliability
  • Participate in on-call rotation

🎯 Requirements

  • BS/BA in Computer Science or equivalent experience
  • 1–3 years of software development experience writing production-level code
  • Proficiency in Python, Java, or Scala, plus working knowledge of SQL
  • Experience with Postgres and MongoDB
  • Experience with AWS or GCP cloud services
  • Knowledge of data processing with Spark, Airflow, Hadoop, and Kafka
  • Strong algorithmic and Unix/Linux skills, and the ability to communicate data-driven insights

🎁 Benefits

  • 25 days paid vacation
  • Private medical insurance for you and your family
  • FitPass – gyms across Serbia
  • Hybrid work schedule: in-office Tue–Thu
  • Growth Investment Program
  • Tech setup – company laptop provided

