Deep Learning and Combinatorial Optimization 2021
"Scaling Up Exact Neural Network Compression by ReLU Stability"
Thiago Serra - Bucknell University

Abstract: We can compress a neural network while exactly preserving its underlying functionality with respect to a given input domain if some of its neurons are stable. However, current approaches to determining the stability of neurons require solving, or finding a good approximation to, multiple discrete optimization problems. In this talk, we present an algorithm based on solving a single optimization problem to identify all stable neurons. Our approach is, at the median, 21 times faster than the state-of-the-art method, which allows us to explore exact compression on deeper (5 x 100) and wider (2 x 800) networks within minutes. For classifiers trained with an amount of L1 regularization that does not worsen accuracy, we can remove up to 40% of the connections.

This talk is based on joint work with Srikumar Ramalingam (Google Research) and Abhinav Kumar (Michigan State University).

Institute for Pure and Applied Mathematics, UCLA
February 23, 2021

For more information: https://www.ipam.ucla.edu/dlc2021
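For intuition, below is a minimal, hypothetical NumPy sketch of the compression idea the abstract describes: a ReLU neuron whose pre-activation provably never exceeds zero on the input domain always outputs zero and can be removed without changing the network's function on that domain. The sketch certifies stability with interval bound propagation, a cheap conservative check, not the single-optimization-problem method presented in the talk; the function names and the toy network are illustrative assumptions only.

# Sketch: prune provably stably-inactive ReLU units on a box input domain.
# Interval bounds are conservative, so anything flagged stable truly is;
# the talk's exact approach would find all stable neurons instead.
import numpy as np

def preactivation_bounds(W, b, lo, hi):
    """Bounds on W @ x + b over the box lo <= x <= hi (elementwise)."""
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b

def prune_stably_inactive(W1, b1, W2, lo, hi):
    """Drop hidden units whose ReLU output is provably always zero.

    The compressed (W1', b1', W2') computes exactly the same function as
    x -> W2 @ relu(W1 @ x + b1) for every x in the box [lo, hi].
    """
    _, upper = preactivation_bounds(W1, b1, lo, hi)
    keep = upper > 0.0                  # unit may activate somewhere: keep it
    return W1[keep], b1[keep], W2[:, keep]

# Toy example: inputs in [0, 1]^2; the second hidden unit is stably inactive.
lo, hi = np.zeros(2), np.ones(2)
W1 = np.array([[1.0, 1.0], [-1.0, -1.0], [2.0, -1.0]])
b1 = np.array([0.0, -0.5, 0.0])
W2 = np.array([[1.0, 3.0, -2.0]])
W1c, b1c, W2c = prune_stably_inactive(W1, b1, W2, lo, hi)
x = np.array([0.3, 0.9])
assert np.allclose(W2 @ np.maximum(W1 @ x + b1, 0.0),
                   W2c @ np.maximum(W1c @ x + b1c, 0.0))

Stably-active units (pre-activation provably nonnegative) can analogously be folded into the next layer, since ReLU acts as the identity on them; that case is omitted here for brevity.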