Software Engineer II - Analytics Data Engineering

Type
Full time

Related skills

AWS, SQL, Python, dbt, Airflow

📋 Description

  • Build and maintain scalable data pipelines using PySpark, Airflow, and dbt.
  • Tune Spark jobs and storage for low-latency enterprise analytics.
  • Treat data as a product: data contracts, documentation, and trust.
  • Monitor data freshness, volume, and schema changes that affect dashboards.
  • Partner with Product, Engineering, and AI/ML teams to support features.

🎯 Requirements

  • 2+ years in data engineering or data-intensive software.
  • Fluent in SQL and Python for data manipulation.
  • Hands-on with Spark (PySpark) and AWS/EMR.
  • Experience with dbt modeling, partitioning, schema evolution, and lakehouse formats (e.g., Iceberg).
  • Latency-focused mindset; experience with materialized views and caching.
  • Collaborative and curious; partner with Product and Data Science.

๐ŸŽ Benefits

  • Health, welfare, and wellbeing benefits.
  • Equity options and sign-on rewards.
  • AI-forward, collaborative engineering culture.
