Added: 6 days ago
Type: Full time

Related skills

Snowflake, Databricks, RAG, Iceberg, Lakehouse

πŸ“‹ Description

  • Establish standards for schemas, naming, events, and data representation.
  • Define data persistence across OLTP/OLAP/lakehouse and feature stores.
  • Set best practices for partitioning, indexing, storage tiering, and lifecycle.
  • Drive cost reduction via architectural optimization and deduplication.
  • Lead technology evaluations and decision frameworks (Databricks vs Snowflake, Delta vs Iceberg).
  • Define when to use batch, micro-batch, streaming, or event-driven architectures.

🎯 Requirements

  • 10+ years in data architecture, 5+ in multi-domain environments.
  • Deep expertise in multi-layer persistence and platforms: Snowflake, Databricks, Delta, Hudi, Iceberg.
  • Strong knowledge of partitioning, sharding, storage tiering, and workload isolation.
  • Experience enabling production AI/ML, including semantic layers, feature stores, and RAG.
  • Hands-on fluency with tables, events, APIs, and data structures aligned to business models.
  • Experience evaluating tech on cost, performance, scalability, governance, and ops.
  • Crisp communication across technical and executive audiences.
  • Knowledge of healthcare data standards (FHIR, USCDI) and familiarity with HIPAA.

🎁 Benefits

  • Benefits starting from Day 1
  • Retirement Plan Matching
  • Flexible Paid Time Off
  • Wellness Support Programs and Resources
  • Parental & Caregiver Leave
  • Fertility & Adoption Support
