Related skills
snowflake, sql, python, databricks, dbt

Description
- Build tools/frameworks to streamline customer integrations and onboarding.
- Create robust ETLs in PySpark and DBT to process billions of records.
- Evaluate and adopt new data technologies to solve current and future needs.
- Collaborate with product, engineering, and GTM teams to deliver data solutions.
- Identify ETL/runtime optimizations to improve scalability.
- Address data-quality issues in messy or incomplete datasets to extract reliable signals.
Requirements
- Significant experience designing and maintaining ETLs for large-scale datasets.
- Proficiency with Python, PySpark, and SQL, plus experience with Databricks, Snowflake, or DBT.
- Strong problem-solving skills; comfortable turning ambiguous requirements into impactful solutions.
- Focus on practical outcomes, balancing rigor with delivery.
- Experience processing complex, unclean datasets with creative approaches.
- Excellent communication and leadership skills, including mentoring others.
Benefits
- Join a mission-driven company reducing millions of pounds of food waste.
- Tackle real-world problems with direct customer impact.
- Collaborative, supportive team where ideas are valued.
- Use cutting-edge tools and platforms to solve data challenges.
- Remote-friendly, subject to state eligibility.
- Opportunities for growth and real impact.