Related skills
SQL, Kubernetes, Apache Spark, data pipelines, ScyllaDB

Description
- Design and build APIs and backend services with Spring Boot for data products.
- Develop data pipelines and ETL using Apache Spark and Airflow.
- Create scalable, low-latency APIs; collaborate across teams.
- Partner with ML and other teams to deliver integrated solutions.
Requirements
- 8+ years in software and/or data engineering with Spark/Airflow.
- Bachelor's degree in CS/Engineering or related field (or equivalent).
- Expertise in distributed systems and SOLID principles.
- Design and implement low-latency APIs with Spring Boot.
- Proficient in Python/Java/Scala; strong programming skills.
- Experience with AWS/GCP/Azure and Docker/Kubernetes.
- Advanced SQL, data warehousing, large-scale query optimization.
- Experience with AdTech or financial data platforms preferred.
Benefits
- Hybrid schedule: in-office Mon-Thu, remote on Fridays.
- Global mental health and financial wellness resources.
- Medical, dental, vision, life, disability, and retirement options.
- Generous vacation and personal time off.