Related skills
azure, aws, python, gcp, scala

Description
- Work on big data projects using the Databricks platform.
- Design reference architectures and productionize customer use cases.
- Scope professional services work with engagement managers.
- Guide the end-to-end design, build, and deployment of data and AI applications.
- Provide escalation support for customer operational issues.
- Collaborate with Databricks teams to ensure delivery meets customer needs.
Requirements
- 6+ years of experience in data engineering, data platforms, and analytics
- Comfortable coding in Python or Scala
- Experience with cloud ecosystems (AWS, Azure, GCP), with expertise in at least one
- Deep experience with Apache Spark and Spark runtime internals
- Familiarity with CI/CD for production deployments
- Travel to customers ~20% of the time
Benefits
- Comprehensive benefits package; region details at mybenefitsnow.com
- Diverse, inclusive culture and global teams
- Remote-friendly and flexible work options