Senior Data Engineer

Added: 12 hours ago
Type: Full time
Salary: Not provided

Related skills

Azure, SQL, Python, GCP, Databricks

πŸ“‹ Description

  • Design, build, and operate end-to-end data pipelines on Azure or GCP.
  • Implement lakehouse patterns (Delta Lake, medallion) for data products.
  • Deliver batch and streaming pipelines (Kafka, Pub/Sub, Event Hubs).
  • Write high-quality Python and SQL for data processing.
  • Apply data modeling, quality, lineage, and governance practices.
  • Set up CI/CD for data pipelines and infrastructure.

🎯 Requirements

  • 5+ years as a Data Engineer in cloud environments.
  • Azure (ADF, Databricks) or GCP (Dataflow, BigQuery) expertise.
  • Python and SQL for data processing.
  • Delta Lake and medallion lakehouse architecture knowledge.
  • Batch and streaming pipelines (Kafka, Pub/Sub, Event Hubs).
  • CI/CD for data pipelines and infrastructure.

🎁 Benefits

  • High-impact engineering role spanning cloud and lakehouse platforms.
  • Build data products for analytics and AI/ML initiatives.
  • Hybrid collaboration with cross-functional teams.
  • Strong engineering culture focused on quality and ownership.
  • Access to Sytac's data community and development programs.