Related skills
SQL, Python, dbt, data modeling, Airflow

Description
- Build and improve data pipelines using SQL and Python
- Contribute to data platform infrastructure reliability
- Help design and model datasets for analytics and AI workflows
- Improve data quality, observability, and governance across systems
- Support infrastructure and contextual data layers for company-wide agents
- Partner with engineering, product, finance, and ops to scale data solutions
Requirements
- Currently pursuing a BS or MS in CS, DS, or related field
- Strong proficiency in SQL and Python
- Solid understanding of data modeling principles and relational databases
- Familiarity with data pipelines, ETL/ELT concepts, or workflow orchestration
- Based in the SF/Bay Area and open to commuting to the SF HQ 2-3 days a week
- Comfortable working in a collaborative, fast-moving engineering environment
Nice to Have
- Experience with dbt, Airflow, Snowflake, or BigQuery
- Exposure to cloud infrastructure (AWS, GCP, or similar)
- Experience working with large datasets or analytics systems
- Interest in AI/ML systems or agent-based workflows