Software Engineer, Data

Added
1 day ago
Type
Full time

Related skills

Snowflake, ETL, Python, Databricks, dbt

πŸ“‹ Description

  • Build and maintain data pipelines that ingest customer data using PySpark, Python, and dbt.
  • Improve new-customer integrations by building tooling that makes onboarding faster and repeatable.
  • Contribute to AI tooling adoption (LLM-assisted data cleaning, semantic validation).
  • Collaborate with product, engineering, and GTM teams to deliver data solutions.
  • Identify ETL bottlenecks and optimize them to improve runtime and scalability.
  • Solve real-world data quality challenges, extracting usable signal from messy data.
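To illustrate the kind of messy-data work described above, here is a minimal, hypothetical Python sketch (not part of the posting; field names like customer_id and signup_date are invented for the example) that normalizes inconsistent records before they enter a pipeline:

```python
from datetime import datetime

def clean_records(rows):
    """Normalize messy customer rows: trim strings, parse dates in
    mixed formats, and drop rows missing a customer id."""
    cleaned = []
    for row in rows:
        cust_id = (row.get("customer_id") or "").strip()
        if not cust_id:
            continue  # no usable key: drop the row
        raw_date = (row.get("signup_date") or "").strip()
        parsed = None
        # Try a few common date formats; leave None if none match.
        for fmt in ("%Y-%m-%d", "%m/%d/%Y", "%d %b %Y"):
            try:
                parsed = datetime.strptime(raw_date, fmt).date()
                break
            except ValueError:
                pass
        cleaned.append({
            "customer_id": cust_id,
            "signup_date": parsed.isoformat() if parsed else None,
        })
    return cleaned

raw = [
    {"customer_id": " A123 ", "signup_date": "03/15/2024"},
    {"customer_id": "", "signup_date": "2024-01-01"},
    {"customer_id": "B456", "signup_date": "not a date"},
]
print(clean_records(raw))
```

In a production pipeline this logic would typically live in a PySpark transformation or a dbt staging model rather than plain Python, but the shape of the problem is the same.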

🎯 Requirements

  • 2+ years building ETLs or data workflows with Python, PySpark, SQL, or similar.
  • Comfortable turning messy datasets into structured data.
  • Ability to identify automation opportunities to simplify workflows.
  • Experience or interest in Databricks, Snowflake, and dbt.
  • Strong problem-solving with ambiguous requirements.
  • Detail-oriented, delivering robust, maintainable solutions.
  • Collaborative, communicative, and eager to learn.

🎁 Benefits

  • Remote-friendly with eligibility in listed US states.
  • Equal employment opportunities (EEO) for all employees and applicants.
  • Growth, learning, and collaboration opportunities.
  • Inclusive, mission-driven culture focused on reducing food waste.

