Universal Conditional Gradient Sliding for Convex Optimization
We present a first-order projection-free method, namely the universal conditional gradient sliding (UCGS) method, for computing approximate solutions to convex differentiable optimization problems. For objective functions with Lipschitz continuous gradients, we show that UCGS terminates with approximate solutions, and that its complexity in terms of gradient evaluations and linear objective subproblems matches the state-of-the-art upper and lower complexity bounds for first-order projection-free methods. It also incorporates additional features that allow for practical implementation. In the weakly smooth case, when the gradient is Hölder continuous, both the gradient and linear objective complexity results of UCGS improve on the current state-of-the-art upper complexity results. Within the class of sliding-type algorithms, to the best of our knowledge, this is the first time a sliding-type algorithm improves not only the gradient complexity but also the overall complexity for computing an approximate solution.
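To make the "projection-free" idea concrete, the following is a minimal sketch of the classical conditional gradient (Frank–Wolfe) method, the basic building block that sliding-type methods such as UCGS accelerate. This is not the UCGS algorithm itself; it only illustrates the linear objective subproblem (the linear minimization oracle) that replaces projections. The problem instance and step-size rule are standard textbook choices, not taken from the paper.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, num_iters=200):
    """Classical Frank-Wolfe (conditional gradient) iteration.

    grad: gradient oracle for the smooth convex objective.
    lmo:  linear minimization oracle, s = argmin_{s in X} <g, s>.
          Each call is one "linear objective subproblem"; no projection
          onto X is ever computed.
    """
    x = x0
    for k in range(num_iters):
        g = grad(x)
        s = lmo(g)                       # solve the linear subproblem over X
        gamma = 2.0 / (k + 2.0)          # standard open-loop step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

def simplex_lmo(g):
    # Over the probability simplex, the linear subproblem is solved by the
    # vertex (coordinate) with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

# Example (hypothetical instance): minimize f(x) = 0.5 * ||x - b||^2 over the
# simplex, where b itself lies in the simplex, so the minimizer is b.
b = np.array([0.1, 0.2, 0.7])
x_star = frank_wolfe(lambda x: x - b, simplex_lmo, np.ones(3) / 3.0)
```

Every iterate is a convex combination of feasible points, so feasibility is maintained without any projection; the standard analysis gives an O(1/k) objective gap for this scheme, which is the baseline that sliding-type complexity results improve upon.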