Type
Full time
Salary
Not provided

Related skills

BigQuery, Python, Kubernetes, AI, data pipelines

πŸ“‹ Description

  • End-to-end pipeline engineering: design, build, and deploy scalable pipelines that ingest data from APIs.
  • Model and transform data to unify disparate sources into high-fidelity data products.
  • Architectural improvements: modernize legacy processes into scalable data architectures.
  • Operational excellence: observability, automated testing, automation, and self-healing features.
  • Investigate complex issues, performing deep-dive analyses to identify root causes.
  • Drive velocity by applying AI and automation to speed up development.

🎯 Requirements

  • Python for large-scale data processing.
  • Cloud data technologies and streaming systems (Kafka, Pub/Sub).
  • Cloud databases such as BigQuery and Spanner; experience with GCP, AWS, or Azure.
  • Hands-on experience running Kubernetes in production.
  • Experience designing data pipelines and ETL processes.
  • Strong English communication and stakeholder-management skills.

🎁 Benefits

  • Flexible time off: autonomy to manage work-life balance.
  • Career development: workshops, frameworks, and training sessions.
  • Impactful work: shape products used by 85,000+ users.
  • Mobility options: mobility budget or company car.
  • Net allowance: support for home office expenses.
  • Comprehensive health insurance for you and dependents.

