Related skills
bigquery, snowflake, sql, dbt, airflow

Description
- Build and maintain the data pipelines and analytics infrastructure powering SRC
- Own batch and streaming pipelines generating core SRC datasets
- Develop analytics models that transform pipeline and service data into reliability reports
- Partner with product managers, backend engineers, and insights teams to define metrics
- Maintain data export pipelines to ensure downstream data is timely and accurate
- Implement data quality practices including validation tests, alerts, and monitoring
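The data quality bullet above can be illustrated with a minimal sketch. This is a hypothetical example, not the team's actual tooling: the function name, fields, and thresholds are all illustrative assumptions.

```python
# Hypothetical sketch of a batch validation check (the "validation tests"
# practice named above). All names and thresholds are illustrative.

def validate_batch(rows, required_fields, min_rows=1):
    """Return a list of validation errors for a batch of records."""
    errors = []
    # Volume check: an empty or undersized batch often signals an upstream failure.
    if len(rows) < min_rows:
        errors.append(f"expected at least {min_rows} rows, got {len(rows)}")
    # Completeness check: flag rows with missing required fields.
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append(f"row {i}: missing fields {missing}")
    return errors

batch = [{"user_id": 1, "ts": "2024-01-01"}, {"user_id": None, "ts": "2024-01-02"}]
print(validate_batch(batch, ["user_id", "ts"]))
```

In practice, checks like these would run inside the pipeline and feed the alerting and monitoring mentioned above, rather than printing to stdout.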
Requirements
- Strong SQL skills and deep experience with data modeling and warehouse design
- Experience building and operating batch data pipelines at scale using Scio, Apache Beam, or Spark
- Comfort with modern cloud data warehouses such as BigQuery and Snowflake
- Experience with analytics engineering tools like dbt and layered data modeling (staging, transformation, reporting)
- Experience with workflow orchestration platforms such as Flyte or Airflow
- Ability to write production-quality code in Scala, Python, or Java
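The layered data modeling pattern named in the requirements (staging, transformation, reporting) can be sketched as three small functions. This is an illustrative assumption of the pattern, not the team's actual models; in a dbt project each layer would be a SQL model rather than Python.

```python
# Illustrative staging -> transformation -> reporting layering.
# All data and function names are hypothetical.

raw_events = [
    {"USER": " alice ", "amount": "10.5"},
    {"USER": "bob", "amount": "4.5"},
    {"USER": " alice ", "amount": "2.0"},
]

def staging(rows):
    # Staging layer: normalize field names and cast types; no business logic.
    return [{"user": r["USER"].strip(), "amount": float(r["amount"])} for r in rows]

def transformation(rows):
    # Transformation layer: apply business logic (here, total spend per user).
    totals = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

def reporting(totals):
    # Reporting layer: shape the result for downstream consumers.
    return sorted(totals.items(), key=lambda kv: -kv[1])

print(reporting(transformation(staging(raw_events))))
# [('alice', 12.5), ('bob', 4.5)]
```

Separating the layers this way keeps cleaning logic out of business logic, so each layer can be tested and rebuilt independently.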