Related skills
bigquery sql python kubernetes airflow
📋 Description
- Build and scale data pipelines from internal systems to the data warehouse.
- Enable data science and AI by maintaining tools for analysis, training, and deployment.
- Protect data and privacy with secure storage, governance, and access controls.
- Run and evolve the data platform to be scalable, stable, and cost-efficient.
- Own core data tooling, e.g., Airflow and Airbyte on Kubernetes.
- Keep the lights on by providing operational support for pipelines and tools.
🎯 Requirements
- 3+ years building and operating production data systems.
- Python and SQL fluency; experience with Fivetran or Airbyte.
- Experience with CI/CD, Infrastructure as Code (Terraform), and DataOps practices.
- Cloud experience (GCP a plus), including BigQuery, Pub/Sub, Dataflow, GKE, Airflow, and Airbyte.
- Experience with streaming pipelines (Kafka, Pub/Sub, Dataflow, Beam).
- Curious, collaborative, and privacy-minded, with a working knowledge of GDPR.
🎁 Benefits
- Impact at scale across 80+ categories.
- Learning and development plans and mentorship.
- Inclusive culture with 60+ nationalities.
- Catavouchers: €100, plus €50 on your birthday.
- Extra day off each year to pursue your passion.
- Extra leave for work anniversaries and life moments.