CSCI 3151 - M20 - Support vector machines & margins
This module introduces support vector machines (SVMs) as maximum-margin classifiers, building on linear models and logistic regression. We develop the geometric idea of a margin and of support vectors, then derive and interpret the hard-margin and soft-margin SVM optimization problems, connecting the penalty parameter C to regularization, margin width, and training error.

In the dual view, we show that only support vectors matter and that kernels (linear and RBF) enter through inner products, revisiting the kernel trick from the previous module. Using low-dimensional synthetic datasets, we visualize decision boundaries, margins, and support vectors while exploring how choices of C and gamma change model complexity, the number of support vectors, and generalization, including a small grid search over hyperparameters.

We also examine a failure case where extreme hyperparameters lead to overfitting, and emphasize practical issues such as feature scaling, leakage-free pipelines, proper train/validation/test splits, and appropriate evaluation metrics. By the end, students should be able to implement and tune linear and kernel SVMs in scikit-learn and to understand when margin-based methods are an appropriate choice.

Course module page: https://web.cs.dal.ca/~rudzicz/Teaching/CS...
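The link between the penalty parameter C and the margin can be illustrated with a minimal sketch in scikit-learn (the dataset and parameter values here are illustrative assumptions, not from the module): a small C tolerates more margin violations, giving a wider margin and more support vectors, while a large C shrinks the margin toward the hard-margin solution.

```python
# Sketch (assumes scikit-learn): how the soft-margin penalty C trades
# margin width against training error. Smaller C -> stronger
# regularization, wider margin, more support vectors.
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Small 2-D synthetic dataset with mildly overlapping classes.
X, y = make_blobs(n_samples=100, centers=2, cluster_std=2.0, random_state=0)

for C in (0.01, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    print(f"C={C}: {clf.n_support_.sum()} support vectors")
```

Counting `n_support_` at the two extremes is a quick way to see the regularization effect before plotting any decision boundaries.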
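The dual-view claim that only support vectors matter can be checked numerically. In this sketch (dataset and gamma value are assumptions for illustration), the fitted model's decision function is recomputed by hand as a kernel-weighted sum over the support vectors alone, f(x) = Σᵢ αᵢyᵢ K(xᵢ, x) + b, using the `dual_coef_` and `support_vectors_` attributes that scikit-learn exposes.

```python
# Sketch (assumes scikit-learn / NumPy): in the dual, the decision
# function touches the data only through kernel values with the
# support vectors: f(x) = sum_i alpha_i y_i K(x_i, x) + b.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
clf = SVC(kernel="rbf", C=1.0, gamma=0.5).fit(X, y)

# Recompute the decision function from the support vectors only.
K = rbf_kernel(X, clf.support_vectors_, gamma=0.5)      # K(x, x_i)
f_manual = K @ clf.dual_coef_.ravel() + clf.intercept_  # sum_i a_i y_i K + b
assert np.allclose(f_manual, clf.decision_function(X))
```

Because the RBF kernel enters only through these inner-product-like evaluations, swapping `rbf_kernel` for a linear kernel changes nothing else in the expression.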
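The practical points above (feature scaling, leakage-free pipelines, proper splits, and a small grid search over C and gamma) fit together in one short sketch; the dataset, grid values, and split sizes are illustrative assumptions, not the module's actual experiment. Putting the scaler inside the pipeline ensures it is refit on each cross-validation fold, so no test-fold statistics leak into training.

```python
# Sketch (assumes scikit-learn): leakage-free scaling inside a
# pipeline, a small grid search over C and gamma, and evaluation
# on a held-out test split.
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

pipe = Pipeline([("scale", StandardScaler()), ("svm", SVC(kernel="rbf"))])
grid = {"svm__C": [0.1, 1, 10], "svm__gamma": [0.1, 1, 10]}
search = GridSearchCV(pipe, grid, cv=5).fit(X_tr, y_tr)

print("best params:", search.best_params_)
print("test accuracy:", search.score(X_te, y_te))
```

Widening the grid toward extreme values (very large C with very large gamma) is one way to reproduce the overfitting failure case the module examines: cross-validation scores collapse even as training accuracy stays high.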