Elastic Net and Lasso are penalized regression methods that address multicollinearity and overfitting by adding a penalty term to the ordinary least squares objective. Lasso uses an L1 penalty (a multiple of the sum of the absolute values of the coefficients), which shrinks some coefficients exactly to zero and thereby performs automatic variable selection. Elastic Net combines the Lasso (L1) penalty with the Ridge (squared L2) penalty, which makes it especially effective when the number of explanatory variables exceeds the number of observations (the "large p, small n" problem).

To implement these methods in EViews, use the Equation object. From the main menu, navigate to Quick - Estimate Equation..., then select ENET - Elastic Net Regularization from the Method dropdown menu. On the Specification tab, enter the dependent variable followed by the list of independent variables (you can include c for an unpenalized intercept). Under the Parameter Specification section, select either Lasso or Elastic net from the Type menu. In the Lambda edit field, you can specify a penalty parameter manually or leave it blank; if left blank, EViews automatically searches for the optimal lambda by cross-validation. Once estimated, EViews provides visual diagnostics such as the Coefficient Path graphs, which let you see how each coefficient shrinks as the penalty parameter increases.
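For readers who want to cross-check the same workflow outside EViews, here is a minimal sketch in Python using scikit-learn (not EViews' own syntax; the data, variable counts, and coefficient values below are purely illustrative). It mirrors the steps described above: a pure-L1 Lasso fit and an Elastic Net fit, each with the penalty parameter chosen by cross-validation, followed by a check of how many coefficients were shrunk to exactly zero:

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV, LassoCV

# Illustrative data: 40 observations, 20 predictors, only 3 of which
# truly matter (a mild "many predictors, few observations" setting).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 20))
beta = np.zeros(20)
beta[:3] = [3.0, -2.0, 1.5]
y = X @ beta + rng.normal(scale=0.5, size=40)

# Lasso: pure L1 penalty; the penalty strength (scikit-learn calls it
# alpha, EViews calls it lambda) is chosen by 5-fold cross-validation,
# analogous to leaving the Lambda field blank in EViews.
lasso = LassoCV(cv=5).fit(X, y)

# Elastic net: blends L1 and L2 penalties; l1_ratio controls the mix
# (1.0 = pure Lasso, 0.0 = pure Ridge), also tuned by cross-validation.
enet = ElasticNetCV(l1_ratio=[0.2, 0.5, 0.8], cv=5).fit(X, y)

# The L1 component drives some coefficients exactly to zero,
# i.e. automatic variable selection.
print("Lasso nonzero coefficients:", int(np.sum(lasso.coef_ != 0)))
print("Elastic net nonzero coefficients:", int(np.sum(enet.coef_ != 0)))
```

The nonzero counts play the same role as reading the Coefficient Path graphs in EViews: they show which variables survive the penalty at the cross-validated lambda.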