Location:
Type: Full time
Salary: Not provided
Related skills: BigQuery, Snowflake, SQL, Databricks, dbt

Description
- Lead requirements discovery and translate business questions into data/measurement solutions.
- Design data models (dimensional/star) to support scalable reporting and self-serve analytics.
- Model data across cloud warehouses (BigQuery, Snowflake, Databricks, Redshift).
- Build BI dashboards with Looker Studio, Power BI, and Tableau.
- Establish data quality, monitoring, governance, and documentation practices.
- Partner with analytics teams on GA4/GTM tracking for usable downstream data.
Requirements
- 2+ years of experience in data engineering, analytics engineering, or BI engineering.
- At least 1 year of experience in client-facing delivery, consultancy, or agency work.
- Expert SQL skills, primarily in an ETL/ELT context; BigQuery-first, with Python favoured.
- Strong dimensional modeling (star schemas), warehouse design, and KPI definition for analytics.
- Experience with BigQuery and another cloud data platform (Snowflake, Databricks, Redshift).
- Experience building BI dashboards (Looker Studio, Power BI, Tableau).
Benefits
- Growth opportunities within APAC data team with clear progression.
- Chance to innovate on analytics projects and product development.
- Potential to advance analytics and data practice at Monks.
- Access to Monks.Flow AI ecosystem and data tooling.