Related skills: PostgreSQL, SQL, MySQL, Databricks, Airflow
📋 Description
- Design scalable data models to support BI, ML, and ops.
- Choose schemas (star, snowflake, denormalized) per use case.
- Collaborate with business teams to translate their data needs into technical designs.
- Promote a single source of truth (SSOT) across data layers and pipelines.
- Build and maintain batch and streaming ETL pipelines.
🎯 Requirements
- Bachelor’s degree or higher in CS, IS, Finance, Math, or related field.
- 7+ years designing and implementing ETL pipelines.
- Experience with streaming data pipelines (Kafka, Flink).
- Proficiency in SQL, with experience in MySQL, PostgreSQL, Oracle, and data warehousing.
- Familiarity with GCP and tools like Databricks and Airflow.
- Knowledge of data governance in finance.
🎁 Benefits
- Competitive salary plus equity.
- Open office space with fully stocked kitchen.
- Regular team-building events.
- Freedom to be creative and make an impact.