Self-Referential Meta Learning (ICML & AutoML 2022)
Meta Learning automates the search for learning algorithms. At the same time, it creates a dependency on human engineering on the meta-level, where meta learning algorithms need to be designed. In this paper, we investigate self-referential meta learning systems that modify themselves without the need for explicit meta optimization. We discuss the relationship of such systems to memory-based meta learning and show that self-referential neural networks require functionality to be reused in the form of parameter sharing. Finally, we propose Fitness Monotonic Execution (FME), a simple approach to avoid explicit meta optimization. A neural network self-modifies to solve bandit and classic control tasks, improves its self-modifications, and learns how to learn, purely by assigning more computational resources to better performing solutions.

Website: http://louiskirsch.com/self-ref
Workshop paper: https://openreview.net/forum?id=adt25...

Kirsch, Louis, and Jürgen Schmidhuber. "Self-Referential Meta Learning." Decision Awareness in Reinforcement Learning Workshop at ICML. 2022.
Kirsch, Louis, and Jürgen Schmidhuber. "Self-Referential Meta Learning." First Conference on Automated Machine Learning (Late-Breaking Workshop). 2022.
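The core idea described in the abstract — improvement without an explicit meta optimizer, purely by giving more compute to better performing self-modifying solutions — can be sketched as a toy illustration. Everything below (the bandit objective, the `Solution` class, the halving-and-cloning allocation rule) is an assumption for illustration, not the authors' implementation:

```python
import random

random.seed(0)

def noisy_bandit_reward(param):
    # Toy bandit-style objective with a noisy peak at param = 3.0
    # (assumed stand-in for the paper's bandit/control tasks).
    return -(param - 3.0) ** 2 + random.gauss(0, 0.1)

class Solution:
    """A solution that evaluates itself and then self-modifies."""
    def __init__(self, param, step):
        self.param = param          # task-solving parameter
        self.step = step            # scale of its own modifications
        self.fitness = float("-inf")

    def run_and_self_modify(self):
        # Evaluate, then modify both the parameter and the
        # modification rule itself (self-reference in miniature).
        self.fitness = noisy_bandit_reward(self.param)
        self.param += random.gauss(0, self.step)
        self.step *= random.choice([0.9, 1.1])

population = [Solution(random.uniform(-5, 5), 1.0) for _ in range(8)]

for _ in range(200):
    for s in population:
        s.run_and_self_modify()
    # Fitness-monotonic allocation: no gradient-based meta optimizer;
    # the better half of the population simply receives further
    # execution by being cloned over the worse half.
    population.sort(key=lambda s: s.fitness, reverse=True)
    for i in range(len(population) // 2):
        src = population[i]
        population[-(i + 1)] = Solution(src.param, src.step)

best = max(population, key=lambda s: s.fitness)
```

The only selection signal is the ordering by fitness; better solutions are never explicitly optimized, they just keep executing, which is the hedged reading of "assigning more computational resources to better performing solutions".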