Data Platform & Engineering Intern

Added
10 days ago
Type
Internship
Salary
Salary not provided

Related skills

SQL, Python, dbt, data modeling, Airflow

πŸ“‹ Description

  • Build and improve data pipelines using SQL and Python
  • Contribute to data platform infrastructure reliability
  • Help design and model datasets for analytics and AI workflows
  • Improve data quality, observability, and governance across systems
  • Support infrastructure and contextual data layers for company-wide agents
  • Partner with engineering, product, finance, and ops to scale data solutions

🎯 Requirements

  • Currently pursuing a BS or MS in CS, DS, or related field
  • Strong proficiency in SQL and Python
  • Solid understanding of data modeling principles and relational databases
  • Familiarity with data pipelines, ETL/ELT concepts, or workflow orchestration
  • Based in SF/Bay Area and open to commuting to SF HQ 2-3 days a week
  • Comfortable working in a collaborative, fast-moving engineering environment

✨ Nice to Have

  • Experience with dbt, Airflow, Snowflake, or BigQuery
  • Exposure to cloud infrastructure (AWS, GCP, or similar)
  • Experience working with large datasets or analytics systems
  • Interest in AI/ML systems or agent-based workflows