Related skills
Azure, AWS, Python, GCP, Scala
📋 Description
- Work with clients on short- to medium-term big data engagements built on Databricks.
- Deliver data engineering, data science, and cloud projects; train users.
- Ensure projects meet specifications while providing excellent customer service.
- Report to the regional Manager/Lead.
- Collaborate with teams to ensure engagement delivery.
🎯 Requirements
- 6+ years in data engineering, data platforms & analytics
- Python or Scala programming
- Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
- Deep experience with Apache Spark and Spark runtime internals
- CI/CD for production deployments
- Willingness to travel to customers 20% of the time
🎁 Benefits
- Zone pay range: $161,280 – $225,792 USD
- Eligible for bonus, equity, and benefits
- Pay zones map by region (see pay documentation)
- Comprehensive benefits per region (see mybenefitsnow)