Added: 7 days ago
Type: Full time

Related skills

SQL, Python, dbt, Airflow, Kafka

πŸ“‹ Description

  • Own data architecture end-to-end for critical business data.
  • Build and operate streaming and batch pipelines across domains.
  • Design domain-oriented canonical data models for analytics and ML.
  • Enforce data quality with tests, lineage, monitoring, and reconciliation.
  • Automate operational workflows across services and warehouses.
  • Enable insights via semantic layers, APIs, and real-time queries.

🎯 Requirements

  • 3+ years as a data or software engineer building data warehouses or distributed systems.
  • Experience designing and implementing data models using dimensional, Data Vault, or ledger-style techniques.
  • Hands-on experience with streaming and CDC tools (Kafka, Debezium), processing engines (dbt, Spark, Flink), and orchestrators (Dagster, Airflow).
  • Experience operating cloud data warehouses (Snowflake, BigQuery, Redshift), including schema design and cost optimization.
  • Proficiency in Python and SQL; familiarity with CI/CD and infrastructure-as-code.
  • Ability to collaborate across engineering, product, and analytics to translate business needs into data systems.

🎁 Benefits

  • Generous holiday and time-off policy
  • Health insurance options, including medical, dental, and vision
  • Work-from-home support
  • Home office setup allowance
  • Monthly allowance for cell phone and internet
  • Parental leave and family planning benefits