Related skills
Azure, AWS, Python, GCP, Databricks
Description
- Design and deliver data architecture on the Databricks platform.
- Build reference architectures and productionize use cases.
- Scope professional services work with engagement managers.
- Guide customers through end-to-end big data and AI deployments.
- Provide escalation support for operational issues.
- Collaborate with Databricks teams to ensure delivery.
Requirements
- Extensive experience in data engineering and analytics.
- Proficient in Python or Scala.
- Cloud experience across AWS, Azure, or GCP.
- Deep experience with Apache Spark and Spark internals.
- Familiar with CI/CD for production deployments.
- Willingness to travel to customers up to 10% of the time.
Benefits
- Comprehensive benefits tailored to region.
- Collaborative, customer-focused culture.
- Career growth in data and AI at scale.