XGBoost is an extreme machine learning algorithm, and that means it's got lots of parts. In this video, we focus on the unique regression trees that XGBoost uses when applied to Regression problems.

NOTE: This StatQuest assumes that you are already familiar with:
...the main ideas behind Gradient Boost for Regression: • Gradient Boost Part 1 (of 4): Regression M...
...and the main ideas behind Regularization: • Regularization Part 1: Ridge (L2) Regression

Also note, this StatQuest is based on the following sources:
The original XGBoost manuscript: https://arxiv.org/pdf/1603.02754.pdf
And the XGBoost Documentation: https://xgboost.readthedocs.io/en/lat...

For a complete index of all the StatQuest videos, check out: https://statquest.org/video-index/

If you'd like to support StatQuest, please consider...
Patreon: / statquest
...or...
YouTube Membership: / @statquest
...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store: https://statquest.org/statquest-store/
...or just donating to StatQuest! https://www.paypal.me/statquest

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter: / joshuastarmer

0:00 Awesome song and introduction
2:35 The initial prediction
3:11 Building an XGBoost Tree for regression
4:07 Calculating Similarity Scores
8:23 Calculating Gain to evaluate different thresholds
13:02 Pruning an XGBoost Tree
15:15 Building an XGBoost Tree with regularization
19:29 Calculating output values for an XGBoost Tree
21:39 Making predictions with XGBoost
23:54 Summary of concepts and main ideas

Corrections:
16:50 I say "66", but I meant to say "62.48". However, either way, the conclusion is the same.
22:03 In the original XGBoost documents they use the epsilon symbol to refer to the learning rate, but in the actual implementation this is controlled via the "eta" parameter. So, I guess, to be consistent with the original documentation, I made the same mistake! :)

#statquest #xgboost
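The index above walks through Similarity Scores, Gain, pruning, and leaf output values. A minimal sketch of those formulas for squared-error regression, following the XGBoost manuscript linked above (for regression, Similarity = (sum of residuals)² / (count + λ), Gain = left + right − root similarity, a split is pruned when Gain − γ is not positive, and a leaf's output is sum of residuals / (count + λ)). The residual values below are made up for illustration, not taken from the video:

```python
# Sketch of the regression-tree formulas from the XGBoost paper
# (squared-error loss). lam = lambda (regularization), gamma = pruning threshold.

def similarity_score(residuals, lam=0.0):
    # (sum of residuals)^2 / (number of residuals + lambda)
    return sum(residuals) ** 2 / (len(residuals) + lam)

def gain(left, right, lam=0.0):
    # Improvement from splitting a node into two leaves,
    # relative to keeping all residuals in one node.
    root = left + right
    return (similarity_score(left, lam)
            + similarity_score(right, lam)
            - similarity_score(root, lam))

def keep_split(left, right, lam=0.0, gamma=0.0):
    # Prune the branch unless Gain - gamma is positive.
    return gain(left, right, lam) - gamma > 0

def output_value(residuals, lam=0.0):
    # Leaf output: sum of residuals / (number of residuals + lambda)
    return sum(residuals) / (len(residuals) + lam)

# Made-up residuals (observed value minus the current prediction),
# partitioned by one candidate threshold:
left = [-10.5]
right = [6.5, 7.5, -7.5]

print(round(gain(left, right), 2))            # large gain: this split helps
print(keep_split(left, right, gamma=130.0))   # a big gamma prunes it anyway

# A tree's contribution is then scaled by the learning rate
# (the "eta" parameter mentioned in the corrections; 0.3 is its default):
new_prediction = 0.5 + 0.3 * output_value(right, lam=1.0)
```

Raising `lam` shrinks both similarity scores and leaf outputs, which is the regularization effect the 15:15 section of the index refers to.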