Related skills
bigquery, terraform, python, kubernetes, airflow
Description
- Design and implement enterprise-scale data infrastructure strategies and standards.
- Build and optimize multi-region data pipelines for petabyte-scale data with 99.9% reliability.
- Lead cost analyses; optimize the data stack to reduce infrastructure spend and improve performance.
- Provide technical guidance to data engineers; drive DataOps, security, and governance practices.
- Evaluate emerging technologies, run proofs of concept (PoCs), and lead the roadmap for data infrastructure modernization.
๐ฏ Requirements
- 7+ years in data platform/engineering roles, including architecture and scaling.
- Extensive experience with Google Cloud Platform, Kubernetes, and Terraform, including IAM security.
- Experience orchestrating data pipelines with Airflow or Dagster, deploying them to the cloud, and working with BigQuery.
- Strong Python programming skills and solid software development principles.
- Ability to troubleshoot data infrastructure issues; strong communicator with technical and non-technical audiences.
- Experience with API development using FastAPI, data governance, CI/CD for data projects, and FinOps.
Benefits
- Free comprehensive health insurance for you and your children.
- Parent Care: extra leave on top of legal parental leave.
- Free mental health and coaching services through Moka.care.
- Flexible remote policy for EU countries and the UK.
- Lunch voucher with Swile card.
- Up to 14 RTT days (French reduced-working-time days) per year.