Related skills
bigquery, terraform, python, kubernetes, airflow

Description
- Design and implement enterprise-scale data infrastructure.
- Build multi-region data pipelines for petabyte-scale data.
- Ensure 99.9% reliability with monitoring and alerts.
- Lead cost analysis to optimize data stack spend.
- Provide technical guidance and architecture reviews for DataOps.
- Evaluate new data tools and drive the data infrastructure roadmap.
Requirements
- 7+ years of experience as a Staff Data Platform Engineer or in a similar role.
- Deep experience with Google Cloud Platform, Kubernetes and Terraform.
- Expertise in Airflow or Dagster for data pipelines.
- Strong Python programming skills and solid software development principles.
- Proficiency in data warehouses such as BigQuery.
- Excellent troubleshooting and cross-functional communication.
Benefits
- Health insurance for you and your children
- Additional parental leave (1 extra month)
- Free mental health and coaching services
- Up to 10 EU/UK flexibility days per year
- Up to 14 RTT days
- Lunch vouchers with a Swile card