Added: 12 days ago
Location:
Type: Full time
Salary: Not provided
Related skills: BigQuery, Looker, SQL, Python, Airflow

Description
- Design and build the DWH data platform for analytics.
- Shape the Loomi Analytics Agent's data layer and capabilities.
- Co-build dashboards and analytics stack on DWH-based platforms.
- Build robust data pipelines to move Engagement data into DWHs.
- Implement batch and streaming ingestion patterns for scalability.
- Lead data models and orchestration via Airflow/Cloud Composer.
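The orchestration bullet above refers to DAG-based scheduling, where tasks run only after their dependencies complete. As a minimal illustration of that idea (a toy stand-in, not Airflow itself; the task names below are hypothetical), a dependency graph can be executed in topological order:

```python
from collections import deque

def topological_order(deps):
    """Return tasks ordered so each runs after all of its dependencies.

    deps maps task -> list of tasks it depends on (must form a DAG).
    """
    indegree = {t: 0 for t in deps}
    children = {t: [] for t in deps}
    for task, parents in deps.items():
        for p in parents:
            indegree[task] += 1
            children[p].append(task)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for c in children[t]:
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order

# Hypothetical pipeline: extract feeds a transform and a quality check,
# both of which must finish before the load step.
pipeline = {
    "extract": [],
    "transform": ["extract"],
    "quality_check": ["extract"],
    "load": ["transform", "quality_check"],
}
print(topological_order(pipeline))
```

Schedulers such as Airflow and Cloud Composer apply the same dependency-ordering principle, adding retries, scheduling intervals, and operators on top.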
Requirements
- Data engineering background with SQL and data modeling.
- Production pipelines on GCP with BigQuery, Iceberg, Spark on DataProc.
- Orchestration with Airflow/Cloud Composer and DAG-based systems.
- Familiarity with open table formats (Iceberg) and DWH integration.
- Strong Python (Scala/Java/Go a plus).
- Fluent use of AI coding agents (Cursor, Claude Code, Copilot).
Benefits
- Remote-friendly with hybrid work and flexible hours.
- 5 paid volunteer days per year.
- RSUs or stock options depending on role and location.
- Annual $1,500 education budget for courses.
- Performance bonus and referral bonus opportunities.
- On-call rotation and reliability focus.