Added: 16 hours ago
Type: Full-time
Salary: Not provided

Related skills

ETL, SQL, GCP, Databricks, data modeling

πŸ“‹ Description

  • Design and implement scalable data models for BI, ML, and operations.
  • Choose schema designs (star, snowflake, normalized/denormalized) per use case.
  • Collaborate with business teams to translate data needs into models.
  • Promote a single source of truth (SSOT) across data layers and pipelines.
  • Maintain data consistency, traceability, and quality across sources.
  • Build and maintain batch and streaming ETL pipelines end-to-end.
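As a minimal illustration of the star-schema modeling mentioned above (table and column names are hypothetical, not taken from the posting), a star schema centers one fact table on surrogate keys into its dimension tables:

```python
# Star-schema sketch using SQLite: a fact table (fact_sales) joined to
# two dimension tables (dim_date, dim_product). Illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    date_id    INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    amount     REAL
);
INSERT INTO dim_date    VALUES (1, '2024-01-01');
INSERT INTO dim_product VALUES (10, 'widget');
INSERT INTO fact_sales  VALUES (100, 1, 10, 9.99);
""")

# A typical BI query: join the fact table to its dimensions and aggregate.
cur.execute("""
SELECT p.name, d.day, SUM(f.amount)
FROM fact_sales f
JOIN dim_product p ON f.product_id = p.product_id
JOIN dim_date    d ON f.date_id = d.date_id
GROUP BY p.name, d.day
""")
rows = cur.fetchall()
print(rows)
```

A snowflake variant would further normalize the dimensions (e.g., splitting product category out of `dim_product`), trading join cost for reduced redundancy.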

🎯 Requirements

  • Bachelor's degree or higher in CS, IS, Finance, Math, or related field.
  • 5+ years designing and implementing ETL pipelines.
  • Experience building streaming data pipelines with Kafka or Flink.
  • Proficiency in SQL and data warehousing; experience with MySQL, PostgreSQL, or Oracle.
  • Familiarity with GCP and tools like Databricks and Airflow.
  • Knowledge of data governance and regulatory requirements in finance.

🎁 Benefits

  • Competitive salary plus equity.
  • Open office space with fully stocked kitchen.
  • Regular team-building events.
  • Freedom to be creative and make an impact.