Course link: https://www.coursera.org/learn/ml-reg... Let's just have a little aside on the coordinate descent algorithm, and then we're going to describe how to apply coordinate descent to solving our lasso objective. So, our goal here is to minimize some function g. This is the same objective that we have whether we're talking about our closed-form solution, gradient descent, or this coordinate descent algorithm. But let me be very explicit: we're saying we want to minimize over all possible w some g(w), where we're assuming g(w) is a function of multiple variables. Let's call it g(w0, w1, ..., wD), where this w is written in bold font. Often, minimizing over a large set of variables can be a very challenging problem. But in contrast, it's often easy to optimize over a single coordinate while keeping all the others fixed, because that turns into just a 1-D optimization problem. And that's the motivation behind coordinate descent, where the coordinate descent algorithm is really intuitive.
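To make that idea concrete, here is a minimal sketch of generic coordinate descent in Python. It is not the course's lasso-specific update; the function names (coordinate_descent, g_1d), the toy quadratic objective, and the use of scipy.optimize.minimize_scalar for the 1-D subproblem are all illustrative assumptions.

```python
# A minimal sketch of coordinate descent: minimize g(w0, ..., wD)
# by repeatedly solving a 1-D problem in each coordinate while
# holding all the other coordinates fixed. Illustrative only.
import numpy as np
from scipy.optimize import minimize_scalar

def coordinate_descent(g, w_init, num_passes=100, tol=1e-8):
    w = np.asarray(w_init, dtype=float).copy()
    for _ in range(num_passes):
        w_old = w.copy()
        for j in range(len(w)):
            # 1-D restriction of g along coordinate j,
            # with all other coordinates of w held fixed.
            def g_1d(wj):
                w_try = w.copy()
                w_try[j] = wj
                return g(w_try)
            # Solve the 1-D subproblem and update coordinate j.
            w[j] = minimize_scalar(g_1d).x
        # Stop once a full pass barely changes w.
        if np.max(np.abs(w - w_old)) < tol:
            break
    return w

# Toy example: a coupled quadratic whose minimizer is (1.6, -2.4).
g = lambda w: (w[0] - 1.0)**2 + (w[1] + 2.0)**2 + 0.5 * w[0] * w[1]
print(coordinate_descent(g, np.zeros(2)))  # approximately [1.6, -2.4]
```

The cross term 0.5 * w0 * w1 couples the coordinates, so a single pass is not enough; the algorithm cycles through the coordinates until the updates stop changing w, which is exactly the behavior described above.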