ECE AI SEMINAR: Statistical physics, neural networks, and neuroscience: from then to now
In the early 1980s, machine learning underwent a remarkable transformation. Work by John Hopfield and Geoffrey Hinton, published between 1982 and 1986, put forward several important concepts that are at the foundation of much current work: associative memories, recurrent neural networks, generative models, and layered neural networks trained by gradient descent. The motivation and presentation style of this work attracted the attention of a generation of theoretical physicists trained in statistical physics, who embraced this new area of research using their conceptual, mathematical, and computational tools. In addition to its contribution to current AI, this line of work opened a new field: theoretical and computational neuroscience. I will review the early work with an emphasis on its importance in laying the foundations for so much subsequent work, and provide some examples of current applications to neuroscience research.
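To make the associative-memory idea mentioned above concrete, here is a minimal sketch of a Hopfield-style network in Python/NumPy. It is an illustrative toy, not the speaker's own code: binary patterns are stored with the Hebbian outer-product rule, and a corrupted cue is cleaned up by iterating the sign-threshold update. All names and parameters (pattern count, network size, noise level) are illustrative choices.

```python
import numpy as np

# Toy Hopfield associative memory: store binary (+/-1) patterns with the
# Hebbian outer-product rule, then recover a stored pattern from a noisy
# cue by iterating the update s <- sign(W s).

rng = np.random.default_rng(0)
n = 100                                        # number of neurons
patterns = rng.choice([-1, 1], size=(3, n))    # three random memories

# Hebbian weights: W = (1/n) * sum_p x_p x_p^T, with zero self-coupling.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Iterate the threshold dynamics until (typically) a fixed point."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1                          # break ties deterministically
    return s

# Corrupt 10% of the first stored pattern, then recall it.
cue = patterns[0].copy()
flipped = rng.choice(n, size=10, replace=False)
cue[flipped] *= -1
recovered = recall(cue)
overlap = (recovered == patterns[0]).mean()
print(f"overlap with stored pattern: {overlap:.2f}")
```

With only three patterns in a hundred neurons the network operates far below its storage capacity, so the dynamics should settle into the stored pattern and the overlap should be at or near 1.0.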