[Ilya top 30] Episode 2: ResNet
In 2015, Microsoft Research solved the vanishing gradient problem with one elegant idea: skip connections. ResNet enabled training networks 150+ layers deep.

⏱️ TIMESTAMPS
0:00 Series Introduction
0:08 The Degradation Problem
0:42 The Residual Insight
1:13 The Skip Connection
1:40 Training the Impossible
2:17 The Legacy of ResNet

📊 KEY RESULTS
• ResNet-152 achieved a 3.57% top-5 error rate on ImageNet 2015
• Surpassed estimated human-level performance (~5% top-5 error)
• Enabled training networks 152+ layers deep
• Won the ILSVRC 2015 classification, detection, and localization tasks
• Introduced "identity shortcut connections" (skip connections)

🔬 TECHNICAL BREAKDOWN
This video explains the core insight behind ResNet: instead of learning the target mapping H(x) directly, learn the residual F(x) = H(x) - x. Because the skip connection passes the input straight through, gradients can flow unimpeded through hundreds of layers. We cover the degradation problem, the residual learning framework, and why this simple change unlocked ultra-deep networks.

📄 RESOURCES
Paper: "Deep Residual Learning for Image Recognition" (CVPR 2016)
Authors: Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun
Paper URL: https://arxiv.org/abs/1512.03385

👤 ABOUT THE TEAM
• Kaiming He - Lead researcher at Microsoft Research Asia, now at MIT
• Xiangyu Zhang - Co-author, contributed to the architecture design
• Shaoqing Ren - Co-author, worked on Faster R-CNN
• Jian Sun - Manager at MSRA and renowned computer vision researcher

🎯 WHY RESNET MATTERS
ResNet's skip connection is now everywhere: transformers, diffusion models, and language models like GPT all use it. The paper has 200,000+ citations. Every foundation model today builds on the insight that deeper networks need "information highways."

🎬 ABOUT THIS SERIES
This is Episode 2 of "Ilya Sutskever's Top 30", a journey through the papers that built the AI revolution. From AlexNet to GPT, we're exploring the ideas that transformed artificial intelligence.
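💻 CODE SKETCH
The residual idea above can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's implementation: the function name `residual_block` and the weight matrices `W1`, `W2` are assumptions, and the real ResNet uses convolutions with batch normalization rather than plain matrix products.

```python
import numpy as np

def relu(z):
    # Elementwise ReLU nonlinearity
    return np.maximum(z, 0.0)

def residual_block(x, W1, W2):
    # Instead of learning H(x) directly, the branch learns the
    # residual F(x); the skip connection adds the input back,
    # so the block computes relu(F(x) + x).
    fx = W2 @ relu(W1 @ x)   # residual branch F(x)
    return relu(fx + x)      # identity shortcut: F(x) + x

# With zero weights the residual branch contributes nothing,
# so a non-negative input passes through unchanged: the block
# defaults to the identity, which is why depth stops hurting.
x = np.array([1.0, 2.0, 3.0])
W = np.zeros((3, 3))
y = residual_block(x, W, W)
```

The zero-weight case is the whole point: a plain stack of layers must *learn* the identity to be harmless, while a residual block *is* the identity until its weights say otherwise.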
🌐 ABOUT LUMOLAB
Created with LumoLab: transform research papers into explainer videos. Try it yourself: https://lumolab.org

📬 Subscribe for more AI research explainers!

#ResNet #DeepLearning #ComputerVision #SkipConnection #VanishingGradient #NeuralNetworks #ImageNet #MachineLearning #AIHistory #ResearchExplained #CVPR