Yoshua Bengio | From System 1 Deep Learning to System 2 Deep Learning | NeurIPS 2019

Tags: machine learning, data science, neural networks, Geoffrey Hinton, Yoshua Bengio, Andrej Karpathy, Andrew Ng, Ian Goodfellow, GANs, deep learning, mathematics, lecture, Terry Tao, convolution, generative, AI, artificial intelligence, robot, self-driving cars, Google Brain, AlphaGo, Yann LeCun, CMU, Facebook, Google, Microsoft, research, big data, Bitcoin, blockchain, programming, computer science, NeurIPS 2019, deep reinforcement learning


Slides: http://www.iro.umontreal.ca/~bengioy/...

Summary: Past progress in deep learning has concentrated mostly on learning from a static dataset, mostly for perception tasks and other System 1 tasks which humans perform intuitively and unconsciously. However, in recent years, a shift in research direction and new tools such as soft attention and progress in deep reinforcement learning are opening the door to novel deep architectures and training frameworks for addressing System 2 tasks (which are done consciously), such as reasoning, planning, capturing causality and obtaining systematic generalization in natural language processing and other applications.

Such an expansion of deep learning from System 1 tasks to System 2 tasks is important for achieving the old deep learning goal of discovering high-level abstract representations, because we argue that System 2 requirements will put pressure on representation learning to discover the kind of high-level concepts which humans manipulate with language. We argue that, towards this objective, soft attention mechanisms are a key ingredient for focusing computation on a few concepts at a time (a "conscious thought"), as per the consciousness prior and its associated assumption that many high-level dependencies can be approximately captured by a sparse factor graph. We also argue that the agent perspective in deep learning can put more constraints on the learned representations so that they capture affordances, causal variables, and model transitions in the environment.

Finally, we propose that meta-learning, the modularization aspect of the consciousness prior and the agent perspective on representation learning should facilitate re-use of learned components in novel ways (even if statistically improbable, as in counterfactuals), enabling more powerful forms of compositional generalization, i.e. out-of-distribution generalization based on the hypothesis of localized (in time, space, and concept space) changes in the environment due to interventions of agents.
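The summary hinges on soft attention as the mechanism that focuses computation on a few high-level concepts at a time. As a rough illustration only, and not code from the talk or its slides, the following minimal NumPy sketch shows scaled dot-product soft attention over a small set of concept vectors, with a temperature knob that makes the attention sharper; all function names, shapes, and the temperature parameter are assumptions made for this example.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def soft_attention(query, concepts, temperature=1.0):
    # query:    (d,)   vector for the current focus of computation (hypothetical)
    # concepts: (n, d) matrix of candidate high-level concept representations
    # Returns (attention weights of shape (n,), blended context vector of shape (d,)).
    scores = concepts @ query / np.sqrt(query.shape[0])   # scaled dot-product scores
    weights = softmax(scores / temperature)               # lower temperature -> sharper weights,
                                                          # i.e. attention on fewer concepts
    context = weights @ concepts                          # convex combination of the concepts
    return weights, context

rng = np.random.default_rng(0)
concepts = rng.normal(size=(8, 16))              # 8 candidate concepts, 16 dimensions each
query = concepts[3] + 0.1 * rng.normal(size=16)  # a query close to concept 3
weights, context = soft_attention(query, concepts, temperature=0.5)
print(np.round(weights, 3))                      # most of the mass lands on concept 3

With the low temperature used here the softmax output is close to one-hot, which is the "few concepts at a time" behaviour the consciousness prior argues for; a trainable version would typically learn the query and concept representations jointly rather than fixing them as in this sketch.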

Comments
  • What do tech pioneers think about the AI revolution? - BBC World Service
    Published: 9 months ago
    1,641,181 views
  • How AI Is Unlocking the Secrets of Nature and the Universe | Demis Hassabis | TED
    Published: 1 year ago
    508,873 views
  • David Duvenaud | Reflecting on Neural ODEs | NeurIPS 2019
    Published: 5 years ago
    27,788 views
  • Visualising software architecture with the C4 model - Simon Brown, Agile on the Beach 2019
    Published: 5 years ago
    474,866 views
  • But what is a neural network? | Deep learning chapter 1
    Published: 7 years ago
    19,523,289 views
  • Geoffrey Hinton: Turing Award Lecture "The Deep Learning Revolution"
    Published: 4 years ago
    8,461 views
  • Кто и как управляет Европой? Дудь – в Европарламенте
    Published: 7 hours ago
    356,301 views
  • Harvard Professor Explains Algorithms in 5 Levels of Difficulty | WIRED
    Published: 1 year ago
    4,212,411 views
  • The AI Revolution Is Underhyped | Eric Schmidt | TED
    Published: 2 weeks ago
    1,229,130 views
  • Variational Autoencoders
    Published: 7 years ago
    550,413 views
