Pop Goes the Stack | The Impact of Inference: Performance | AI
Traditional performance meant deterministic response times: identical inputs produced near-identical execution times. Optimizations reduced latency, but variance was minimal. Enter #AI inference, and performance engineering has been flipped upside down. Latency now depends on model size, tokenization, batching strategies, and generation settings, so identical inputs may produce different response times. The new dimension of performance is variance: not just how fast the system responds, but how response times distribute across requests, how many tokens per second are processed, and how efficient each response is relative to cost.

In this episode of #F5's Pop Goes the Stack, Lori MacVittie, Joel Moses, and special guest Nina Forsyth dive into the impact of AI inference on measuring performance. It's time to rethink performance observability and focus on infrastructure optimization, agent-to-agent interactions, and robust measurement techniques. Listen in to learn how traditional approaches must evolve to manage this multi-dimensional puzzle.

Chapters:
00:00 Welcome to Pop Goes the Stack
00:36 Once upon a time: Deterministic performance
02:27 Inference and the shift to non-deterministic performance
03:42 The human factor in AI latency tolerance
05:30 AI system variability: Performance measurement and cost optimization challenges
07:01 Optimizing for non-deterministic AI
08:51 Measuring AI performance: New metrics
10:41 Observability is key
13:37 Does performance management need a multi-layered infrastructure?
16:47 Key takeaways: New performance definition, start with infrastructure

Find out more in the blog, "How AI inference changes application delivery": https://go.f5.net/w9barr3j

Learn how you can stay ahead of the curve and keep your stack whole with additional insights on app security, multicloud, AI, and emerging tech: https://go.f5.net/ieoxk0fj

More about F5: https://go.f5.net/4c0zuulu
Read our blog: https://go.f5.net/sw5ktzmn
Follow us on LinkedIn: https://go.f5.net/hzhd02ai
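The variance-centric metrics the episode describes (latency distribution across requests, tokens per second, spread rather than a single average) can be sketched in a few lines. This is an illustrative example with made-up request data, not anything from the episode; the helper `percentile` is a simple nearest-rank implementation written for this sketch.

```python
import math
import statistics

# Hypothetical per-request measurements for an inference endpoint:
# (latency_seconds, tokens_generated). Values are illustrative only.
requests = [
    (0.8, 120), (1.1, 150), (0.9, 130), (3.2, 400),
    (1.0, 140), (2.5, 310), (0.7, 100), (1.3, 160),
]

latencies = sorted(sec for sec, _ in requests)

def percentile(sorted_vals, p):
    """Nearest-rank percentile: value at ceil(p/100 * n) - 1."""
    k = max(0, math.ceil(p / 100 * len(sorted_vals)) - 1)
    return sorted_vals[k]

# Distribution, not just an average: tail latency vs. the median.
p50 = percentile(latencies, 50)
p95 = percentile(latencies, 95)

# Tokens per second per request measures efficiency independent of
# how long each generated response happens to be.
tps = [tokens / sec for sec, tokens in requests]

print(f"p50 latency: {p50:.2f}s, p95 latency: {p95:.2f}s")
print(f"mean tokens/sec: {statistics.mean(tps):.1f}")
print(f"latency stdev (variance signal): {statistics.stdev(latencies):.2f}s")
```

Note how the p95 here is more than three times the p50: under non-deterministic inference, a mean or median alone hides exactly the tail behavior that users experience.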