Related skills: Azure, AWS, Python, GCP, Scala

Description
- Work on short to medium engagements using the Databricks platform.
- Deliver data engineering, data science, and cloud projects that integrate with client systems.
- Design reference architectures and productionize customer use cases.
- Collaborate with engagement managers to scope professional services work.
- Guide strategic customers as they implement big data projects and AI apps.
- Provide escalated support for customer operational issues.
Requirements
- 6+ years of experience in data engineering, data platforms, and analytics.
- Comfortable writing code in Python or Scala.
- Cloud knowledge across two or more ecosystems (AWS/Azure/GCP) with expertise in at least one.
- Deep experience with distributed computing using Apache Spark and Spark runtime internals.
- CI/CD for production deployments; working knowledge of MLOps.
- Design and deployment of end-to-end data architectures; ability to manage scope and timelines.
Benefits
- Comprehensive benefits and perks.
- Pay range transparency and regional details on the benefits page.
- Diversity and inclusion commitments.
- Access to the Databricks benefits portal: https://www.mybenefitsnow.com/databricks