Don't miss out! Join us at our next flagship conference: KubeCon + CloudNativeCon in Amsterdam, The Netherlands (23-26 March 2026). Connect with our graduated, incubating, and sandbox projects as the community gathers to further the education and advancement of cloud native computing. Learn more at https://kubecon.io

Deploying Lightweight AI Agents at the Healthcare Edge With K8s + Ollama - Gary Arora & Samarth Shah, Deloitte

AI agents are reshaping healthcare operations, but traditional centralized LLMs come with challenges: high latency, data privacy concerns, and steep cloud costs. In this talk, we'll explore how lightweight, Kubernetes-native deployments of AI agents powered by Ollama and K3s/MicroK8s enable intelligent, autonomous operations directly at the healthcare edge.

We'll walk through a real-world architecture where multi-agent systems orchestrate hospital workflows such as patient triage, imaging coordination, and resource scheduling, all without sending sensitive data offsite. You'll see how small LLMs deployed locally can drive powerful workflows, which K8s primitives are used to scale and monitor agents, and how this approach achieves both operational efficiency and regulatory compliance (HIPAA, GDPR).

This talk blends cloud-native engineering, AI orchestration, and real healthcare needs, offering a blueprint for deploying resilient, scalable AI agent ecosystems anywhere edge computing is needed.
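
As a rough illustration of the pattern described above (not taken from the talk itself), the following minimal Python sketch shows how an edge agent running in a K3s/MicroK8s pod might query a locally deployed Ollama model over Ollama's REST API. The in-cluster service name "ollama:11434", the model "llama3.2:1b", and the triage-style prompt are assumptions for illustration only.

# Minimal sketch (not from the talk): an edge agent querying a local Ollama
# instance over its REST API. Assumes Ollama is reachable inside the cluster
# at the hypothetical Service name "ollama:11434" and that a small model such
# as "llama3.2:1b" has already been pulled onto the node.
import json
import urllib.request

OLLAMA_URL = "http://ollama:11434/api/generate"  # assumed in-cluster Service name
MODEL = "llama3.2:1b"                            # assumed small local model

def ask_local_model(prompt: str) -> str:
    """Send a single non-streaming generation request to the local Ollama API."""
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Hypothetical triage-style prompt; the request never leaves the edge node,
    # which is the privacy property the talk emphasizes for HIPAA/GDPR.
    print(ask_local_model("Summarize: patient reports chest pain and shortness of breath."))

In a deployment like the one the abstract describes, an agent of this kind would typically be scheduled alongside the Ollama pod on the same edge node, so inference traffic stays on-site and only orchestration metadata, if anything, leaves the facility.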