Related skills
Looker, Snowflake, SQL, Python, dbt
Description
- Own the data architecture end-to-end, keeping data flexible and accessible.
- Automate operational workflows to provide data where needed.
- Enforce data quality at scale with tests and monitoring.
- Design and maintain scalable data models and analytics infrastructure.
- Enable insights with semantic layers, APIs, and real-time queries.
Requirements
- 5+ years building data warehouses, pipelines, and distributed systems
- Deep expertise with modern data stack: ingestion, transformation, dbt
- Orchestration with Airflow; BI with Looker/Metabase
- Experience with Snowflake, BigQuery, or Redshift and governance
- Production-grade Python or SQL; CI/CD and IaC workflows
- Collaborate across engineering, product, and analytics to meet data needs
Benefits
- 100% paid health coverage
- Generous PTO and sick leave
- Lunch, snacks, and coffee provided
- Company retreats
- Opportunities to travel and see the impact of your work
- Hybrid work: in-office at least 3 days a week, with remote flexibility