Added: 2 days ago
Type: Full time

Related skills

Snowflake · ETL · SQL · Python · Databricks

πŸ“‹ Description

  • Build and maintain data pipelines with PySpark, Python, and dbt for billions of records.
  • Improve customer integrations with tooling to speed onboarding.
  • Adopt AI tooling such as LLM-assisted cleaning, semantic validation, and anomaly detection.
  • Collaborate across product, engineering, and GTM to deliver data solutions.
  • Optimize ETL runtimes and scalability to speed integrations.
  • Resolve data quality issues by working with messy customer data to extract signals.
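The "messy customer data" work above typically means normalizing inconsistent field formats before loading. A minimal sketch in plain Python (the field names and formats here are hypothetical, purely for illustration; in practice this would likely run as a PySpark or dbt transformation):

```python
from datetime import datetime

def clean_record(raw: dict) -> dict:
    """Normalize one messy customer row into a structured record."""
    # Strip stray whitespace from the name field
    name = (raw.get("name") or "").strip()

    # Accept a couple of common date formats; leave None if unparseable
    signed_up = None
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
        try:
            signed_up = datetime.strptime(raw.get("signed_up", ""), fmt).date()
            break
        except ValueError:
            continue

    # Coerce revenue strings like "$1,200" into a float
    revenue_raw = str(raw.get("revenue", "0")).replace("$", "").replace(",", "")
    try:
        revenue = float(revenue_raw)
    except ValueError:
        revenue = 0.0

    return {"name": name, "signed_up": signed_up, "revenue": revenue}

rows = [
    {"name": "  Acme Corp ", "signed_up": "03/15/2024", "revenue": "$1,200"},
    {"name": "Globex", "signed_up": "2024-01-02", "revenue": 350},
]
cleaned = [clean_record(r) for r in rows]
```

The same per-record logic scales to billions of rows when expressed as a PySpark UDF or, better, as native column expressions.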

🎯 Requirements

  • 2+ years building ETLs/data workflows with Python, PySpark, or SQL.
  • Experience turning messy data into structured, usable datasets.
  • Experience identifying automation to simplify workflows.
  • Experience or interest in Databricks, Snowflake, and dbt.
  • Strong problem-solving skills; comfortable turning ambiguous requirements into delivered impact.
  • Collaborative and communicative across teams.

