GraphGeeks in Discussion: Emerging AI Memory with Dave Bechberger
Watch Amy Hodler and Dave Bechberger dive into the crucial role of memory in advanced AI systems, especially at the intersection of graphs, knowledge graphs, and generative AI. Dave Bechberger, currently focusing on MCP servers, agentic memory, and semantic data layers, explains that memory is fundamental because standard LLM calls are atomic and lack any recollection of prior interactions; an agent without memory lacks continuity across complex user interactions.

The discussion breaks down three key types of memory and how graphs apply to each:

- Episodic memory: transactional details integrated directly into the context.
- Short-term memory: session-based interactions that require compaction or summarization.
- Long-term memory: extracting and storing patterns, trends, and preferences across multiple interactions.

Recommended papers:

- General memory for LLMs primer: https://arxiv.org/abs/2310.08560
- Zep temporal KG for agent memory: https://arxiv.org/abs/2501.13956
- Mem0 for scalable long-term memory: https://arxiv.org/abs/2504.19413
- Cognee approach showing the overlap of problem spaces: https://arxiv.org/html/2505.24478v1
- A-Mem, a dynamic approach to long-term memory: https://arxiv.org/abs/2502.12110
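To make the three memory types concrete, here is a minimal, hypothetical Python sketch (not from the talk or any of the papers above): episodic memory holds summaries injected into the context, short-term memory buffers session turns and compacts them once a limit is hit, and long-term memory stores extracted preferences. The class and method names, and the trivial string-join "summarization", are illustrative stand-ins only — a real system would use an LLM (and, per the discussion, a graph store) for these steps.

```python
from collections import deque

class AgentMemory:
    """Illustrative sketch of episodic, short-term, and long-term memory."""

    def __init__(self, short_term_limit=4):
        self.episodic = []          # transactional details injected into context
        self.short_term = deque()   # session turns, compacted when over the limit
        self.long_term = {}         # patterns/preferences persisted across sessions
        self.short_term_limit = short_term_limit

    def record_turn(self, user_msg, agent_msg):
        self.short_term.append((user_msg, agent_msg))
        if len(self.short_term) > self.short_term_limit:
            self._compact()

    def _compact(self):
        # Stand-in for LLM summarization: collapse the two oldest turns
        # into one episodic line.
        oldest = [self.short_term.popleft() for _ in range(2)]
        self.episodic.append("summary: " + "; ".join(u for u, _ in oldest))

    def remember_preference(self, key, value):
        # Long-term: an explicitly extracted user preference.
        self.long_term[key] = value

    def build_context(self):
        # Assemble the context for the next atomic LLM call.
        parts = list(self.episodic)
        parts += [f"user: {u} / agent: {a}" for u, a in self.short_term]
        parts += [f"pref {k}={v}" for k, v in self.long_term.items()]
        return "\n".join(parts)

mem = AgentMemory(short_term_limit=2)
mem.record_turn("hi", "hello")
mem.record_turn("what graphs?", "knowledge graphs")
mem.record_turn("thanks", "anytime")  # triggers compaction of the two oldest turns
mem.remember_preference("topic", "graphs")
print(mem.build_context())
```

Because each LLM call is atomic, `build_context` is the only bridge between calls: everything the agent "remembers" must be serialized back into the prompt.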