Geometry of Efficient Fine Tuning: LoRA, Intrinsic Dimension, Subspace Learning - Preethi Srinivasan
Large pre-trained models are now the norm, making Parameter-Efficient Fine-Tuning techniques like LoRA essential to reduce computational and storage costs. But why do these methods work so well? This talk explores the theory of Intrinsic Dimension (ID): the idea that neural networks often need far fewer effective directions to learn a task than their total parameter count suggests. We'll estimate a task's ID via random subspace training on an MLP for MNIST, reproducing results from foundational papers. Then we'll compare how LoRA approximates subspace training in compute, training time, and accuracy, clarifying key design trade-offs. LoRA succeeds not just through engineering but by exploiting the low-dimensional structure revealed by ID. We also highlight the PyTorch internals that enable flexible subspace training. This talk builds on a four-part blog series bridging theory and engineering.

Track: AI, ML, Data Science
Speakers: Preethi Srinivasan
Read more: https://cfp.in.pycon.org/2025/talk/7E...
Give feedback: https://cfp.in.pycon.org/2025/talk/7E...

Copyright © 2025 PyCon India. Licensed under Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0). https://creativecommons.org/licenses/...
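The random subspace training the abstract refers to can be illustrated without the talk's MNIST/MLP setup. The core idea: freeze the full parameter vector θ₀, draw a fixed random projection P, and optimize only a small d-dimensional vector z, so the effective parameters are θ = θ₀ + Pz; the smallest d that recovers most of the full-training accuracy estimates the task's intrinsic dimension. Below is a minimal NumPy sketch, not the talk's code: a toy linear classifier on synthetic data, with all shapes, hyperparameters, and the `train_in_subspace` helper assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification task (a stand-in for MNIST in the talk):
# points in R^20, labeled by a random linear rule.
D = 20 * 2 + 2                      # full parameter count of a 20->2 linear classifier (W + b)
X = rng.normal(size=(200, 20))
w_true = rng.normal(size=20)
y = (X @ w_true > 0).astype(int)

def loss_and_grad(theta):
    """Cross-entropy loss and gradient of a linear classifier flattened into theta."""
    W, b = theta[:40].reshape(20, 2), theta[40:]
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # stable softmax
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    n = len(y)
    loss = -np.log(p[np.arange(n), y] + 1e-12).mean()
    g = p.copy()
    g[np.arange(n), y] -= 1
    g /= n
    grad = np.concatenate([(X.T @ g).ravel(), g.sum(axis=0)])
    return loss, grad

def train_in_subspace(d, steps=500, lr=0.5):
    """Train only a d-dim vector z; full parameters are theta0 + P @ z."""
    theta0 = np.zeros(D)
    P = rng.normal(size=(D, d)) / np.sqrt(D)      # fixed random projection, never trained
    z = np.zeros(d)
    for _ in range(steps):
        _, grad = loss_and_grad(theta0 + P @ z)
        z -= lr * (P.T @ grad)                    # chain rule: dL/dz = P^T dL/dtheta
    theta = theta0 + P @ z
    W, b = theta[:40].reshape(20, 2), theta[40:]
    return (np.argmax(X @ W + b, axis=1) == y).mean()

for d in (1, 2, 5, 10):
    print(f"d={d:2d}  train accuracy = {train_in_subspace(d):.2f}")
```

Sweeping d and watching accuracy climb toward the full-training baseline is exactly the intrinsic-dimension measurement procedure; only the gradient with respect to z is ever used, which is what makes the method parameter-efficient.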
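LoRA, which the talk compares against subspace training, constrains the weight update of a layer to a low-rank factorization: the frozen weight W₀ is augmented with a trainable product (α/r)·BA, where A and B together have far fewer entries than W₀. A minimal NumPy sketch of the forward pass and the parameter savings, with layer size, rank, and the α scaling all assumed values rather than the talk's configuration:

```python
import numpy as np

d_out, d_in, r = 512, 512, 8              # layer shape and LoRA rank (assumed values)
rng = np.random.default_rng(0)

W0 = rng.normal(size=(d_out, d_in))       # frozen pre-trained weight
A = rng.normal(size=(r, d_in)) * 0.01     # trainable, initialized small
B = np.zeros((d_out, r))                  # trainable, zero-init: no change at start
alpha = 16                                # scaling hyperparameter

def lora_forward(x):
    """y = W0 x + (alpha/r) * B(Ax); the full d_out x d_in update is never materialized."""
    return W0 @ x + (alpha / r) * (B @ (A @ x))

full_params = W0.size
lora_params = A.size + B.size
print(f"full fine-tune params: {full_params}")
print(f"LoRA params (r={r}):   {lora_params} ({100 * lora_params / full_params:.1f}%)")
```

Because B starts at zero, the adapted layer is exactly the pre-trained layer at initialization; training then explores only the rank-r subspace spanned by BA, which is the sense in which LoRA approximates the random-subspace picture above with a learned, structured subspace instead of a random one.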