Related skills
bigquery, looker, snowflake, sql, dbt

Description
- Design, build, and maintain production data models for incentives, promotions, and lifecycle analytics.
- Collaborate with Data Engineering to model data from multiple systems and implement ELT patterns.
- Define KPIs and enable self-serve analytics with clean semantic models and LookML.
- Set data quality standards with automated testing, lineage, and monitoring for trusted dashboards.
- Collaborate with Product, Marketing, and Engineering to scope requirements and deliver high-impact datasets.
- Continuously improve pipeline performance, reliability, and cost efficiency; promote best practices.
Requirements
- 4+ years in analytics engineering, data engineering, or BI building production data models in a cloud stack.
- Advanced SQL proficiency (joins, window functions, optimization) with Snowflake, BigQuery, or Redshift.
- 2+ years implementing and maintaining dbt projects in production with Git workflows.
- Hands-on experience orchestrating ELT/ETL pipelines with Airflow, Dagster, or similar.
- Experience building semantic layers and BI models (Looker/LookML or equivalent) for self-serve analytics.
- Automated data quality testing and data observability, with ownership of documentation and lineage.
Benefits
- Remote-first "Flex First" policy with flexible work locations (Canada provincial eligibility applies).
- Equity grants and annual refresh grants.
- Competitive benefits and strong team collaboration.