Related skills
BigQuery, Looker, SQL, Python, Airflow

Description
- Design and build the DWH data platform (Snowflake, BigQuery, Databricks).
- Build robust data pipelines moving engagement data to DWHs.
- Implement batch and streaming ingestion (Kafka → GCS → Iceberg → DWH).
- Shape the Loomi Analytics Agent's data layer and agentic workflows.
- Co-design dashboards and analytics stack (semantic layers, BI tools).
- Leverage AI coding agents daily to accelerate development.
Requirements
- Solid data engineering with strong SQL and data modeling.
- Production-grade pipelines on GCP (BigQuery, Iceberg, Spark) and Airflow.
- Experience with orchestration tools (Airflow/Cloud Composer) and DAGs.
- Familiarity with open table formats (Iceberg) and engines.
- Strong Python (preferred); Scala/Java/Go also relevant; fluency with AI coding agents.
- Data quality, lineage, observability; cross-team collaboration.
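To make the data-quality expectation above concrete, here is a minimal, hypothetical sketch of the kind of row-level validation a candidate might build to gate a pipeline before loading engagement data into the DWH. The field names and rules are illustrative assumptions, not from this posting.

```python
def check_rows(rows, required_fields):
    """Split rows into (valid, errors): a row is rejected if any
    required field is missing or None. Each error records the row
    index and the offending field names."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append((i, missing))
        else:
            valid.append(row)
    return valid, errors


# Example: two engagement events, one with a null user_id (schema is illustrative)
events = [
    {"user_id": 1, "event": "click", "ts": "2024-01-01T00:00:00Z"},
    {"user_id": None, "event": "view", "ts": "2024-01-01T00:01:00Z"},
]
valid, errors = check_rows(events, ["user_id", "event", "ts"])
```

In practice a check like this would run as a task in an Airflow DAG, with failures surfaced to lineage/observability tooling rather than silently dropped.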
Benefits
- Remote-first with distributed teams across Central Europe.
- Flexible working hours and remote/hub options.
- 5 paid volunteering days per year.
- Employee support programs (EAP) and Calm subscription.
- $1,500 professional education budget annually.