Related skills
BigQuery, Redshift, GitHub Actions, Snowflake, Python

Description
- Design clean data models and metrics with dbt.
- Build and refine pipelines ingesting data from ops systems to analytics.
- Maintain data platform infrastructure focusing on quality and ELT efficiency.
- Architect analytics components: BI, semantic layers, and data warehouse.
- Develop and maintain streaming pipelines from multiple sources.
- Collaborate with cross-functional teams to capture data requirements and meet them.
Requirements
- 5+ years in data/analytics engineering.
- Python and orchestration tools (Prefect or Dagster).
- Proficient with dbt; data modeling and ELT concepts.
- Snowflake, BigQuery, or Redshift experience; data warehouses.
- Build/maintain modern data pipelines (streaming and batch).
- CI/CD and IaC (GitHub Actions, Terraform).
- Mention "coffee" in your application.
Benefits
- Health/dental options (100% covered for employees, 50% for family).
- Vision insurance.
- 401(k) match of 4%.
- 4 weeks of PTO, with increases starting in year two.
- 12 holidays + 2 floating holidays.
- Remote-work friendly since 2012.