Stanford Seminar - Computing with High-Dimensional Vectors

EE380: Computer Systems Colloquium Seminar
Computing with High-Dimensional Vectors
Speaker: Pentti Kanerva, Stanford CSLI & UC Berkeley Redwood Center for Theoretical Neuroscience

Computing with high-dimensional vectors complements traditional computing and occupies the gap between symbolic AI and artificial neural nets. Traditional computing treats bits, numbers, and memory pointers as basic objects on which all else is built. I will consider the possibility of computing with high-dimensional vectors as basic objects, for example with 10,000-bit words, when no individual bit or subset of bits has a meaning of its own--when any piece of information encoded into a vector is distributed over all components. Thus a traditional data record subdivided into fields is encoded as a high-dimensional vector with the fields superposed.

Computing power arises from the operations on the basic objects--from what is called their algebra. Operations on bits form Boolean algebra, and the addition and multiplication of numbers form an algebraic structure called a "field." Two operations on high-dimensional vectors correspond to the addition and multiplication of numbers. With permutation of coordinates as the third operation, we end up with a system of computing that is in some ways richer and more powerful than arithmetic, and also different from linear algebra. Computing of this kind was anticipated by von Neumann, described by Plate, and has proven to be possible in high-dimensional spaces of different kinds.

The three operations, when applied to orthogonal or nearly orthogonal vectors, allow us to encode, decode, and manipulate sets, sequences, lists, and arbitrary data structures. One reason for high dimensionality is that it provides a nearly endless supply of nearly orthogonal vectors. Making them is simple because a randomly generated vector is approximately orthogonal to any vector encountered so far. The architecture includes a memory which, when cued with a high-dimensional vector, finds its nearest neighbors among the stored vectors; a neural-net associative memory is an example of such a memory.

Circuits for computing in high dimensions are thousands of bits wide, but the components need not be ultra-reliable or fast. Thus the architecture is a good match for emerging nanotechnology, with applications in many areas of machine learning. I will demonstrate high-dimensional computing with a simple algorithm for identifying languages.

About the Speaker: Pentti Kanerva came to Stanford from Finland in 1967 to work in Patrick Suppes' computer laboratory dedicated to computer-assisted instruction. In addition to programming, he designed and built hardware to network city-wide clusters of computer terminals. Kanerva's life-long interest in understanding brains in computing terms motivates his study of computation with high-dimensional vectors. His PhD thesis in Philosophy was published as the book Sparse Distributed Memory (MIT Press) and led to research at NASA Ames Research Center, the Swedish Institute of Computer Science, the Redwood Neuroscience Institute and, presently, UC Berkeley's Redwood Center for Theoretical Neuroscience. This research has been published in papers on Binary Spatter Code, Random Indexing, and Hyperdimensional Computing.

For more information about this seminar and its speaker, you can visit http://ee380.stanford.edu/Abstracts/1...

Support for the Stanford Colloquium on Computer Systems Seminar Series provided by the Stanford Computer Forum.
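
The operations described above are easiest to see in code. Below is a minimal sketch, assuming the dense binary (10,000-bit) flavor of the architecture: XOR stands in for the multiplication-like operation (binding), bitwise majority stands in for the addition-like operation (bundling), and a record with superposed fields is decoded by unbinding followed by a nearest-neighbor lookup. The record fields (name, age, city), the NumPy implementation, and all function names are illustrative choices, not taken from the talk; permutation, the third operation, appears in the language-identification sketch further below.

# A minimal sketch of the dense binary (10,000-bit) flavor of high-dimensional
# computing described above. The record fields and values are illustrative.
import numpy as np

D = 10_000                       # dimensionality of every hypervector
rng = np.random.default_rng(0)

def random_hv():
    # A fresh random hypervector; any two such vectors agree on ~50% of bits,
    # i.e. they are nearly orthogonal in the Hamming sense.
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):
    # Multiplication-like operation: XOR. It is its own inverse,
    # so bind(bind(a, b), b) recovers a exactly.
    return a ^ b

def bundle(*vs):
    # Addition-like operation: bitwise majority vote.
    # The result is similar to (correlated with) each of its inputs.
    return (np.sum(vs, axis=0) > len(vs) / 2).astype(np.uint8)

def hamming_sim(a, b):
    # Fraction of matching bits: ~0.5 for unrelated vectors, 1.0 for identical ones.
    return float(np.mean(a == b))

# Encode a record with superposed fields, e.g. {name: Alice, age: 34, city: Paris}.
# Every field name and every value gets its own random hypervector.
fields = {k: random_hv() for k in ("name", "age", "city")}
values = {k: random_hv() for k in ("Alice", "34", "Paris")}

record = bundle(bind(fields["name"], values["Alice"]),
                bind(fields["age"],  values["34"]),
                bind(fields["city"], values["Paris"]))

# Decode the "name" field: unbinding yields a noisy copy of the stored value,
# and an associative (nearest-neighbor) memory cleans it up.
noisy = bind(record, fields["name"])
best  = max(values, key=lambda k: hamming_sim(noisy, values[k]))
print(best)                                            # Alice
print(round(hamming_sim(noisy, values["Alice"]), 2))   # ~0.75, far above the 0.5 chance level

Because any two independently generated 10,000-bit vectors agree on only about half of their bits, the decoded vector's roughly 75% agreement with the stored value stands out by a wide margin, which is what makes the nearest-neighbor cleanup reliable.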
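
The talk closes with a language-identification demo. The description does not spell out the algorithm, so the sketch below follows the commonly published n-gram formulation of hyperdimensional language identification: letter trigrams are encoded by permuting (rotating) and binding letter vectors, each language's training text is bundled into a single profile vector, and an unknown text is classified by nearest neighbor among the profiles. It reuses the helpers above; the toy corpora and names such as text_profile are mine, and the actual demo in the seminar may differ.

# A sketch of trigram-based language identification with hypervectors, reusing
# np, random_hv, bind, bundle, and hamming_sim from the sketch above.
# All names and corpora here are illustrative.
import string

letters = {c: random_hv() for c in string.ascii_lowercase + " "}

def permute(a, k=1):
    # The third operation: permute coordinates (here a cyclic shift),
    # used to mark a letter's position inside an n-gram.
    return np.roll(a, k)

def text_profile(text, n=3):
    # Bundle the bound n-gram vectors of a text into a single profile hypervector.
    chars = [c for c in text.lower() if c in letters]
    grams = []
    for i in range(len(chars) - n + 1):
        g = letters[chars[i]]
        for j in range(1, n):
            g = bind(permute(g), letters[chars[i + j]])   # rotate-and-bind per position
        grams.append(g)
    return bundle(*grams)

# Toy training corpora (real use would need far more text per language).
profiles = {
    "english": text_profile("the quick brown fox jumps over the lazy dog " * 20),
    "latin":   text_profile("lorem ipsum dolor sit amet consectetur adipiscing elit " * 20),
}

query = text_profile("the dog jumps over the fox")
print(max(profiles, key=lambda lang: hamming_sim(query, profiles[lang])))   # expected: english

Each trigram shared between the query and a training text pulls their profiles toward each other, while unrelated trigrams average out near 50% agreement, so even a short query lands measurably closer to the right language's profile.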
The Colloquium on Computer Systems Seminar Series (EE380) presents current research in the design, implementation, analysis, and use of computer systems. Topics range from integrated circuits to operating systems and programming languages. It is free and open to the public, with new lectures each week. Learn more: http://bit.ly/WinYX5
