Related skills: BigQuery, Snowflake, SQL, dbt, Airflow

Description
- Build and maintain data pipelines and analytics infrastructure for SRC.
- Own batch and streaming pipelines for core SRC datasets.
- Develop analytics models that turn data into reliability and ROI reporting.
- Partner with product, backend, and insights teams to define metrics.
- Maintain data export pipelines to ensure timely and accurate data.
- Implement data quality practices with tests, alerts, and monitoring.
Requirements
- Strong SQL skills with data modeling and warehouse design.
- Experience building batch pipelines at scale with Scio, Apache Beam, or Spark.
- Comfortable with BigQuery, Snowflake, or similar warehouses.
- Experience with dbt and layered data modeling (staging, transformation, reporting).
- Experience with Flyte or Airflow for workflow orchestration.
- Ability to write production-quality code in Scala, Python, or Java.
Benefits
- Health insurance
- Six-month paid parental leave
- 401(k) retirement plan
- 23 paid days off
- Paid flexible holidays
- Paid sick leave