Related skills
bigquery, terraform, python, kubernetes, dbt

Description
- Design and implement enterprise-scale data infrastructure strategies
- Build multi-region data pipelines for petabyte-scale data
- Ensure 99.9% reliability with advanced monitoring and alerting
- Lead cost analysis to reduce infra spend while boosting performance
- Provide technical guidance to data engineers and cross-functional teams
- Define roadmap for data infra modernization
Requirements
- 7+ years of experience as a Staff Data Platform Engineer or in a similar role
- GCP experience with Kubernetes and Terraform for automated deployments
- Expert in building and operating data pipelines with Airflow or Dagster
- Deep experience with BigQuery and modern data warehouses
- Proficient in Python, with a strong grasp of software development principles
- Excellent troubleshooter and clear communicator
Benefits
- Health insurance for you and your children
- 25 days of vacation + up to 14 RTT days
- Relocation support for international mobility
- Lunch vouchers worth €8.50/day
- 50% reimbursement of your public transport subscription
- Parent Care Program: extra leave