Video: Google’s TITANS AI Just Got a Real Memory - The AGI Breakthrough OpenAI Feared! (uploaded to YouTube)
Google may have just solved the one problem every AGI roadmap has been stuck on: memory. In this video, we break down how Google’s new Titans architecture and the MIRAS framework turn large language models from short-term pattern matchers into systems that can actually build their own long-term memory while they run. Instead of forcing everything through attention, Titans splits the job in two: local attention handles the present, while a separate deep neural memory learns from “surprise” signals and updates itself at test time, storing only the concepts that really matter across millions of tokens.

We walk through how MIRAS reframes Transformers, RetNet, Mamba, RWKV, and Titans as variations on the same idea: associative memories with different choices of memory architecture, attentional bias, retention gate, and online learning rule. In that design space, Titans picks a deep memory with disciplined forgetting and gradient-based updates, which lets it keep a fixed memory cost while scaling to contexts well beyond two million tokens and beating much larger Transformer models on long-range benchmarks like needle-in-a-haystack and BABILong.

We also look at the variants Moneta, Yaad, and Memora; why deeper memories consistently win; and what happens when you stop treating retrieval hacks and vector databases as band-aids and start treating memory as the core of the model itself. By the end, you’ll see why Google is betting that the future of AI isn’t just “bigger Transformers” but architectures that remember, adapt, and forget on purpose, and why Titans might be the first real blueprint for long-context agents, codebase-scale reasoning, and always-on AI systems that grow as they run.

And if you want the real story behind the world’s fastest-moving AI breakthroughs, make sure to like and subscribe to Evolving AI for daily coverage.
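To make the core idea concrete, here is a minimal toy sketch of a test-time-updated associative memory in the spirit of what the video describes. This is an illustrative assumption, not Google’s implementation: the memory is a single linear map (Titans uses a deep MLP), the “attentional bias” is a plain squared-error reconstruction loss, the gradient magnitude stands in for the surprise signal, and a fixed decay factor plays the retention gate.

```python
import numpy as np

class NeuralMemory:
    """Toy online associative memory: a fixed-size matrix M trained by
    gradient descent on one (key, value) pair at a time, at test time.
    Memory cost stays constant no matter how long the input stream is."""

    def __init__(self, dim, lr=0.1, decay=0.01):
        self.M = np.zeros((dim, dim))  # fixed-size memory state
        self.lr = lr                   # online learning rate
        self.decay = decay             # retention gate: forget a little on each write

    def surprise(self, k, v):
        # How badly the current memory predicts v from k (squared error).
        err = self.M @ k - v
        return float(err @ err)

    def write(self, k, v):
        # One gradient step on L(M) = 0.5 * ||M k - v||^2,
        # with multiplicative decay so stale associations fade on purpose.
        err = self.M @ k - v
        grad = np.outer(err, k)  # dL/dM
        self.M = (1 - self.decay) * self.M - self.lr * grad

    def read(self, k):
        return self.M @ k

# Usage: stream (key, value) pairs; high surprise would trigger writes.
rng = np.random.default_rng(0)
mem = NeuralMemory(dim=8)
k = rng.normal(size=8)
k /= np.linalg.norm(k)  # unit-norm key keeps the online updates stable
v = rng.normal(size=8)

before = mem.surprise(k, v)
for _ in range(50):
    mem.write(k, v)
after = mem.surprise(k, v)
print(before, after)  # surprise drops as the association is memorized
```

The four MIRAS axes the video names all show up even in this toy: the matrix is the memory architecture, the squared error is the attentional bias, the decay term is the retention gate, and plain gradient descent is the online learning rule. Swapping any of these choices gives a different point in the same design space.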