This video dives deep into XGBoost (Extreme Gradient Boosting), one of the most powerful machine learning algorithms. Learn how XGBoost builds decision trees, calculates similarity scores, and identifies optimal splits using gain metrics. This tutorial explains the step-by-step process of how XGBoost refines residuals and improves predictions.

Course Link HERE: https://sds.courses/ml-2

You can also find us here:
Website: https://www.superdatascience.com/
Facebook: / superdatascience
Twitter: / superdatasci
LinkedIn: / superdatascience
Contact us at: [email protected]

Chapters:
00:00 Introduction to XGBoost
00:34 Residuals and the First Model
01:04 XGBoost’s Unique Decision Tree Algorithm
01:36 Calculating Similarity Score and Gain
03:11 Determining the Best Split
05:17 Optimized Split Selection Algorithm
06:25 Stopping Criteria for Tree Splitting
08:03 Recap: Building the Second Model
09:06 XGBoost’s Sequential Refinement Process
09:36 Conclusion and Next Steps

#XGBoost #GradientBoosting #MachineLearning #MLTutorial #DataScience #AIExplained #DecisionTrees #BoostingModels #MLConcepts #ErrorReduction #AI #PredictiveModeling #XGBoostTutorial #DataProcessing #mltipsandtricks

The video is about XGBoost (Extreme Gradient Boosting), a powerful machine learning algorithm known for its efficiency and predictive performance. It provides a detailed explanation of how XGBoost builds and optimizes decision trees, focusing on:

Residuals: How XGBoost starts from the residuals of the previous model to refine its predictions.
Similarity Score: The metric XGBoost uses to evaluate how homogeneous the residuals in a node are (see the similarity/gain sketch below).
Gain Metric: How XGBoost measures the improvement offered by a candidate split in order to pick the best one (also in the first sketch below).
Optimized Splitting: The algorithm’s approach of discretizing feature values to find splits efficiently instead of evaluating every possible threshold (see the split-search sketch below).
Stopping Criteria: Rules such as maximum tree depth, lack of positive gain, or too few samples in a node, which dictate when the algorithm stops splitting (see the stopping-criteria sketch below).
Sequential Process: How XGBoost builds trees iteratively, each tree correcting the errors of the one before it (see the boosting-loop sketch below).

The video walks through the creation of a single decision tree in XGBoost, explaining the calculations and logic behind the splits, and highlights what makes XGBoost different from standard decision trees. It’s a technical yet accessible guide for anyone who wants to understand and implement XGBoost in their machine learning workflows.
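For readers who want the arithmetic behind the similarity score and gain, here is a minimal Python sketch of the regression case the video covers. With squared-error loss the similarity score reduces to (sum of residuals)² / (count + λ); the function names, the `lam` default, and the toy residual values are illustrative choices, not taken from the video.

```python
import numpy as np

def similarity_score(residuals, lam=1.0):
    # Similarity = (sum of residuals)^2 / (count + lambda),
    # where lambda is the L2 regularization strength.
    return np.sum(residuals) ** 2 / (len(residuals) + lam)

def gain(left, right, lam=1.0):
    # Gain of a split = left similarity + right similarity
    # - parent similarity.
    parent = np.concatenate([left, right])
    return (similarity_score(left, lam)
            + similarity_score(right, lam)
            - similarity_score(parent, lam))

# Toy residuals left over from a first model's predictions.
residuals = np.array([-10.5, 6.5, 7.5, -7.5])
print(similarity_score(residuals))          # similarity of the root node
print(gain(residuals[:1], residuals[1:]))   # gain of one candidate split
```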
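The split search can be sketched the same way, reusing `gain` from the sketch above. `best_split` is the exhaustive variant (test the midpoint between every pair of adjacent feature values); `quantile_candidates` hints at the optimized variant by testing only quantile thresholds. Both names are our own, and real XGBoost uses a weighted quantile sketch rather than the plain unweighted quantiles shown here.

```python
import numpy as np

def best_split(x, residuals, lam=1.0):
    # Exhaustive search: test the midpoint between every pair of
    # adjacent sorted feature values, keep the split with maximum gain.
    order = np.argsort(x)
    x_sorted, r_sorted = x[order], residuals[order]
    best_threshold, best_gain = None, -np.inf
    for i in range(1, len(x_sorted)):
        threshold = (x_sorted[i - 1] + x_sorted[i]) / 2
        g = gain(r_sorted[:i], r_sorted[i:], lam)
        if g > best_gain:
            best_threshold, best_gain = threshold, g
    return best_threshold, best_gain

def quantile_candidates(x, num_bins=4):
    # Approximate search: evaluate only thresholds at quantiles of the
    # feature instead of every adjacent pair of values.
    qs = np.linspace(0, 1, num_bins + 1)[1:-1]
    return np.quantile(x, qs)
```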
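The stopping criteria can be expressed as a single check. The defaults here mirror XGBoost's documented defaults (`max_depth=6`, `gamma=0.0`), but `should_stop` itself is a hypothetical helper, not library API, and `min_samples` is a simplification of XGBoost's cover-based `min_child_weight` rule.

```python
def should_stop(num_samples, depth, best_gain,
                max_depth=6, gamma=0.0, min_samples=2):
    # Stop splitting a node when any criterion fires: the tree has hit
    # its maximum depth, the best available gain minus gamma (the
    # minimum split loss) is not positive, or the node holds too few
    # samples to split.
    return (depth >= max_depth
            or best_gain - gamma <= 0
            or num_samples < min_samples)
```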
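Finally, the sequential refinement process can be sketched as a boosting loop over one-split trees (stumps), reusing `best_split` from above. The leaf output formula (sum of residuals / (count + λ)) and the learning rate `eta=0.3` match XGBoost's standard regression setup and default `learning_rate`; the data is a toy example for illustration only.

```python
import numpy as np

def leaf_output(residuals, lam=1.0):
    # A leaf predicts: sum of residuals / (count + lambda).
    return np.sum(residuals) / (len(residuals) + lam)

eta, lam = 0.3, 1.0                   # eta: XGBoost's default learning rate
x = np.array([1.0, 2.0, 3.0, 4.0])    # toy feature
y = np.array([-10.0, 7.0, 8.0, -7.0]) # toy target
pred = np.full_like(y, y.mean())      # initial prediction (base score)

for _ in range(5):                    # each round fits one stump
    residuals = y - pred
    threshold, _ = best_split(x, residuals, lam)
    left = x <= threshold
    # The new tree predicts the residuals; scale its output by eta
    # and add it to the running model.
    pred = pred + eta * np.where(left,
                                 leaf_output(residuals[left], lam),
                                 leaf_output(residuals[~left], lam))
    print(np.round(pred, 2))          # predictions creep toward y
```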