Related skills
ETL, SQL, GCP, Databricks, data modeling
Description
- Design and implement scalable data models for BI, ML, and operations.
- Choose schema designs (star, snowflake, normalized/denormalized) per use case.
- Collaborate with business teams to translate data needs into models.
- Promote a single source of truth (SSOT) across data layers and pipelines.
- Maintain data consistency, traceability, and quality across sources.
- Build and maintain batch and streaming ETL pipelines end-to-end.
Requirements
- Bachelor's degree or higher in CS, IS, Finance, Math, or related field.
- 5+ years designing and implementing ETL pipelines.
- Experience building streaming data pipelines with tools such as Kafka or Flink.
- Proficiency in SQL and relational databases (MySQL, PostgreSQL, or Oracle), plus data warehousing experience.
- Familiarity with GCP and tools like Databricks and Airflow.
- Knowledge of data governance and regulatory requirements in finance.
Benefits
- Competitive salary plus equity.
- Open office space with fully stocked kitchen.
- Regular team-building events.
- Freedom to be creative and make an impact.