Added: 1 day ago
Type: Full time
Salary: Not provided

Related skills

BigQuery, SQL, Airflow, Kafka, Spark

📋 Description

  • Build, operate, and improve production data systems.
  • Translate business needs into batch and streaming pipelines.
  • Build data pipelines with Beam, Spark, or Flink.
  • Improve observability and data quality across pipelines.
  • Use AI-assisted development tools to speed delivery.
  • Collaborate with product, analytics, and business stakeholders.

🎯 Requirements

  • Bachelor's degree in CS/Engineering/Math or related field.
  • Data engineering fundamentals: batch and streaming.
  • Experience with Beam, Spark, or Flink; familiarity with Dataflow as a Beam runner.
  • SQL and data modeling; BigQuery/Snowflake/Redshift.
  • Kafka and event-driven architectures on GCP/AWS/Azure.
  • Airflow/Composer/Prefect for workflows.

๐ŸŽ Benefits

  • Global, inclusive workplace and growth opportunities.
  • Collaborative, ownership-driven culture.
  • AI-enabled tooling to improve throughput responsibly.

