Related skills
azure, aws, python, gcp, scala

Description
- Work on client engagements to solve big data challenges with Databricks.
- Deliver data engineering, data science, and cloud projects for clients.
- Design and productionize reference architectures and use cases.
- Scope work with engagement managers and customer input.
- Guide end-to-end big data and AI deployments for customers.
- Provide escalated support for operational issues.
Requirements
- 6+ years of experience in data engineering and analytics.
- Proficient in Python or Scala.
- Cloud ecosystems: AWS/Azure/GCP with depth in at least one.
- Deep experience with Apache Spark and Spark runtime internals.
- Familiarity with CI/CD for production deployments.
- Working knowledge of MLOps.
- Design and deployment of performant end-to-end data architectures.
- Experience delivering technical projects and managing scope.
Benefits
- Transparent, zone-based pay ranges; USD base pay with bonus potential.
- Eligibility for annual bonus, equity, and benefits.
- Global offices and remote working options.
- Commitment to diversity and inclusion.
- Region-specific benefits information available online.