Related skills
AWS, PostgreSQL, Redis, Databricks, Airflow

Description
- Building and maintaining our big-data pipelines
- Design and implement complex, high-scale systems using diverse tech
- Collaborate with engineers and data scientists on planning and maintenance
- Implement solutions in the AWS cloud environment, and work in Databricks with PySpark
Requirements
- BSc degree in Computer Science or equivalent practical experience
- Passion for building robust, fault-tolerant, and scalable systems and products
- At least 3 years of server-side software development experience in C#, Go, Python, or a similar language
- Experience building large-scale web APIs; microservices, AWS, and databases (Redis, PostgreSQL) are an advantage
- Familiarity with Big Data technologies: Spark, Databricks, Airflow
- Worked in a cloud environment such as AWS or GCP
Benefits
- Hybrid work model with flexibility
- Competitive compensation and benefits
- Opportunities for career growth and learning
- Inclusive, diverse workplace culture