[H-JEPA] Hierarchical Joint Embedding Predictive Architecture (V-JEPA) for Autonomous Intelligence
We've all felt the excitement surrounding Large Language Models, but if you listen to pioneers of the field such as Yann LeCun, one of the 'Godfathers of Deep Learning', they'll tell you that we've hit a wall. Current AI can predict the next word, but it doesn't understand the world. It can write code, but it lacks the basic common sense of a house cat. So how do we move from chatbots to Autonomous Machine Intelligence?

Today, we are exploring a radical departure from the status quo: Latent Variable Energy-Based Models. This isn't just a technical tweak; it's a philosophical shift. We're moving away from probabilistic generative models that try to predict every pixel, and toward architectures that understand the physics of reality.

In this episode, we unpack LeCun's vision for a modular AI, focusing on:

- The JEPA Framework: why predicting in 'abstract representation space' is the key to efficiency.
- Hierarchical Planning (H-JEPA): how machines can learn to plan across different time scales, from milliseconds to months.
- The Role of Latent Variables: solving the problem of high-dimensional uncertainty so machines can navigate the messy, unpredictable real world.
- Energy Minimization: the mathematical engine that allows an AI to reason through common-sense physics.

If you've ever wondered what comes after the LLM craze, this is the roadmap. Let's get into the architecture of the future.
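For listeners who want to see the shape of the idea in code, below is a minimal, hypothetical sketch of a JEPA-style latent-variable energy-based model in PyTorch. It is not the architecture discussed in the episode or in LeCun's paper: the ToyJEPA class, the min_over_latent helper, and all layer sizes are illustrative assumptions. It only shows the core mechanics: encode x and y into representations, predict s_y from s_x plus a latent variable z, and define the energy as the prediction error in representation space, minimized over z.

```python
# Illustrative sketch only; module names and sizes are assumptions, not the
# architecture from the episode. Requires PyTorch.
import torch
import torch.nn as nn

class ToyJEPA(nn.Module):
    """Toy joint embedding predictive architecture."""
    def __init__(self, obs_dim=64, rep_dim=16, latent_dim=4):
        super().__init__()
        # Separate encoders for the observed context x and the target y.
        self.enc_x = nn.Sequential(nn.Linear(obs_dim, rep_dim), nn.ReLU(),
                                   nn.Linear(rep_dim, rep_dim))
        self.enc_y = nn.Sequential(nn.Linear(obs_dim, rep_dim), nn.ReLU(),
                                   nn.Linear(rep_dim, rep_dim))
        # Predictor maps s_x plus a latent z (capturing what x alone cannot
        # determine about y) to a guess of s_y.
        self.pred = nn.Sequential(nn.Linear(rep_dim + latent_dim, rep_dim), nn.ReLU(),
                                  nn.Linear(rep_dim, rep_dim))

    def energy(self, x, y, z):
        s_x, s_y = self.enc_x(x), self.enc_y(y)
        s_y_hat = self.pred(torch.cat([s_x, z], dim=-1))
        # Energy = prediction error in abstract representation space,
        # not in pixel space.
        return ((s_y_hat - s_y) ** 2).mean(dim=-1)


def min_over_latent(model, x, y, latent_dim=4, steps=50, lr=0.1):
    """Infer z by gradient descent, approximating E(x, y) = min_z E(x, y, z)."""
    z = torch.zeros(x.shape[0], latent_dim, requires_grad=True)
    opt = torch.optim.SGD([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        model.energy(x, y, z).sum().backward()
        opt.step()
    z = z.detach()
    return z, model.energy(x, y, z)


# Usage: lower energy means y is a more plausible continuation of x
# under the model.
model = ToyJEPA()
x, y = torch.randn(8, 64), torch.randn(8, 64)
z_star, e = min_over_latent(model, x, y)
print(e.shape, e.mean().item())
```

Minimizing over z is what lets an energy-based model cope with uncertainty: when several futures y are compatible with x, different values of z can account for each of them, so the model is not forced to average them into one blurry prediction.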