Related skills
docker, terraform, aws, python, kubernetes

Description
- Own end-to-end technical delivery of Unstructured AI deployments.
- Build and scale full-stack systems processing large volumes of unstructured content.
- Embed with customer teams to align metadata, governance, and security needs.
- Scope work, sequence delivery, and remove blockers for fast iterations.
- Balance scope, speed, and quality to move pilots to production.
- Codify repeatable deployment patterns into tools and playbooks.
๐ฏ Requirements
- Shipped production systems in real customer environments.
- Strong Python skills for data processing and API development.
- Built/deployed systems processing large-scale unstructured data.
- Experience with data pipelines, microservice architectures, and API design.
- Cloud infra: AWS, GCP, or Azure; Terraform; Docker/Kubernetes.
- Experience with LLM-based AI enrichment models.
- Metadata systems, data catalogs, or document AI workflows.
- Bachelor's degree or equivalent; Brussels work authorization.
Benefits
- Flexible benefits program
- Competitive compensation, health coverage, and time off
- Inclusive onboarding and collaboration culture
- Accommodations for applicants available
- Equal opportunity employer