Type: Full time
Related skills: AWS, SQL, data infrastructure, Scala, Spark

Description
- Design and implement robust data infrastructure in AWS using Spark with Scala
- Evolve core data pipelines to scale for growth
- Store data in optimal engines and formats balancing performance and cost
- Collaborate with cross-functional teams to design data solutions
- Design and implement knowledge graphs with Batch Processing and APIs
- Leverage and optimize AWS resources for scale
๐ฏ Requirements
- Production data engineering experience
- Spark and Scala proficiency for data infrastructure
- Experience delivering large-scale services and APIs backed by relationship-heavy datasets
- Familiarity with data lakes and cloud data warehouses
- Strong proficiency in AWS services
- SQL expertise for data manipulation and extraction
- Bachelor's degree in Computer Science or related field
Benefits
- Flexible work options via PinFlex
- Equity alongside base compensation
- Inclusive, equitable workplace culture
- Opportunities to influence data infrastructure at scale
- Growth and learning opportunities