By identifying the key "success elements" that make existing neural networks so effective, we can discern important features that will help us as we create the next generation of neural networks. Here, we examine two neural network architectural classes identified in the "Logical Topology of Neural Networks." Of these two classes, class "a" includes the Multilayer Perceptron (MLP) neural networks, trained with some form of stochastic gradient descent (e.g., backpropagation), and class "b" includes the (Little-)Hopfield neural network, the Boltzmann machine, and the Restricted Boltzmann Machine (RBM). These latter networks are all trained by minimizing an Ising-form free energy equation.

The key features that make these networks effective are that they include latent variables and that they train the connection weights by minimizing an equation: either sum-squared-error (for class "a" neural networks) or a free energy equation (for class "b" neural networks). These two key features, latent variables and equation minimization, are our take-aways for assessing the next neural network classes.

This is a re-edit of a previous YouTube video.

Opt-In HERE: www.themesis.com/themesis/

Visit Themesis for the associated blogpost (good links!): https://themesis.com/2023/05/16/new-n...

Subscribe to the Themesis YouTube channel - click this link: https://www.youtube.com/@themesisinc....
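To make the "equation minimization" idea concrete for class "b" networks, here is a minimal sketch (not taken from the video) of the Ising-form energy that a Hopfield network decreases as it updates its units. The weight matrix `W`, biases `b`, and state vector `s` below are made-up toy values for illustration; a real Hopfield network would have symmetric weights with a zero diagonal, as assumed here.

```python
# Sketch of the Ising-form energy used by class "b" networks
# (Hopfield, Boltzmann machine, RBM). All values here are toy examples.

def ising_energy(s, W, b):
    """E(s) = -1/2 * sum_ij W[i][j]*s_i*s_j - sum_i b_i*s_i, with s_i in {-1,+1}."""
    n = len(s)
    pairwise = sum(W[i][j] * s[i] * s[j] for i in range(n) for j in range(n))
    bias = sum(b[i] * s[i] for i in range(n))
    return -0.5 * pairwise - bias

def hopfield_update(s, W, b, i):
    """Asynchronous update of unit i: align it with its local field.
    For symmetric W with zero diagonal, this never increases the energy."""
    field = sum(W[i][j] * s[j] for j in range(len(s))) + b[i]
    s = list(s)
    s[i] = 1 if field >= 0 else -1
    return s

# Toy example: two units, symmetric coupling, no biases.
W = [[0, 1], [1, 0]]
b = [0, 0]
s = [1, -1]

e_before = ising_energy(s, W, b)
s = hopfield_update(s, W, b, 1)   # update unit 1 toward its local field
e_after = ising_energy(s, W, b)
print(e_before, e_after)           # energy does not increase: 1.0 -1.0
```

The same energy expression, extended with an entropy term over the latent (hidden) units, gives the free energy that Boltzmann machines and RBMs minimize during training.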