Related skills
bigquery, sql, python, dbt, airflow

Description
- Build and maintain production ELT pipelines into BigQuery.
- Own data domains end-to-end from ingestion to marts.
- Write and maintain dbt models, tests, and docs.
- Develop Airflow DAGs on Cloud Composer to orchestrate data workflows.
- Implement data quality checks and monitoring.
- Optimize queries and models in BigQuery for cost/performance.
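For candidates who want a concrete flavor of the cost/performance work above, here is a minimal sketch of estimating on-demand query cost from bytes scanned. The $6.25-per-TiB figure is an assumed US on-demand rate; always check current BigQuery pricing for your region.

```python
# Rough BigQuery on-demand cost estimator -- a minimal sketch.
# PRICE_PER_TIB_USD is an assumed US on-demand rate, not a guarantee;
# verify against current pricing before relying on it.

TIB = 2 ** 40  # bytes in one tebibyte
PRICE_PER_TIB_USD = 6.25  # assumed US on-demand rate

def estimate_query_cost(bytes_processed: int) -> float:
    """Estimated on-demand cost in USD for a query that scans
    `bytes_processed` bytes (e.g. as reported by a dry run)."""
    return round(bytes_processed / TIB * PRICE_PER_TIB_USD, 4)

# A query scanning 500 GiB would cost roughly:
print(estimate_query_cost(500 * 2 ** 30))  # ~3.0518 USD
```

Running a dry run first and budgeting from the reported bytes scanned is a common way to keep on-demand costs predictable.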
Requirements
- 3–5 years of experience in data engineering or a related data infrastructure role.
- Proven ability to design scalable data pipelines and warehouse architectures.
- Strong GCP expertise: BigQuery, Cloud Storage, Cloud Composer, Pub/Sub, Dataflow.
- Hands-on dbt usage and Airflow experience at production scale.
- SQL proficiency (BigQuery SQL) and Python for data tasks.
- Familiarity with data modeling concepts (star schema, SCD).
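To illustrate the SCD concept named in the last requirement, here is a minimal sketch of a Type 2 slowly changing dimension update in plain Python. All field names (`customer_id`, `tier`, `valid_from`, `valid_to`, `is_current`) are illustrative, not taken from any specific schema; in practice this logic would typically live in a dbt snapshot or a SQL `MERGE`.

```python
# Minimal SCD Type 2 sketch: close out the current row when a tracked
# attribute changes, then append the new version with open-ended validity.
# Field names here are hypothetical, for illustration only.

def apply_scd2(dimension: list, incoming: dict, load_date: str) -> list:
    """Apply one incoming record to a Type 2 dimension (list of dicts)."""
    key, new_tier = incoming["customer_id"], incoming["tier"]
    for row in dimension:
        if row["customer_id"] == key and row["is_current"]:
            if row["tier"] == new_tier:
                return dimension  # no attribute change: nothing to do
            # Expire the current version as of the load date.
            row["valid_to"], row["is_current"] = load_date, False
    # Append the new current version (also handles brand-new keys).
    dimension.append({
        "customer_id": key,
        "tier": new_tier,
        "valid_from": load_date,
        "valid_to": None,
        "is_current": True,
    })
    return dimension

dim = [{"customer_id": 1, "tier": "basic",
        "valid_from": "2024-01-01", "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, {"customer_id": 1, "tier": "pro"}, "2024-06-01")
# dim now holds two rows: the expired "basic" version and the current "pro" one.
```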
Benefits
- Remote-first culture
- Flexible PTO
- Health, dental and vision insurance
- 13 paid holidays
- Company volunteer days