Enjoy my video presentation on our work "Robust Weight Imprinting: Insights from Neural Collapse and Proxy-Based Aggregation", a collaboration between me (Justus Westerhoff), Golzar Atefi, Mario Koddenbrock, Alexei Figueroa, Alexander Löser, Erik Rodner, and Felix A. Gers. The paper was accepted at TMLR in 12/2025. We are really thankful for the convenient and responsive process, and for the detailed feedback of the reviewers and Action Editor that greatly improved our paper.

Paper: https://arxiv.org/abs/2503.14572
Code: https://github.com/DATEXIS/IMPRINT
OpenReview: https://openreview.net/forum?id=duU11...
Slides: https://tinyurl.com/IMPRINT-slides

Abstract: The capacity of foundation models allows for their application to new, unseen tasks. The adaptation to such tasks is called transfer learning. An efficient transfer learning method that circumvents parameter optimization is imprinting. The conceptual differences between studies on imprinting form the basis of our systematic investigation. In this work, we propose the general IMPRINT framework, identifying three main components: generation, normalization, and aggregation. Through the lens of this framework, we conduct an in-depth analysis and a comparison of the existing methods. Our findings reveal the benefits of representing novel data with multiple proxies in the generation step and show the importance of proper normalization. Beyond an extensive analytical grounding, our framework enables us to propose a novel variant of imprinting which outperforms previous work on transfer learning tasks by 4%. This variant determines proxies through clustering motivated by the neural collapse phenomenon -- a connection that we draw for the first time.

This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) -- FIP-12 -- Project-ID 528483508.

Recorded on 05.12.2025 using OBS.
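The three components named in the abstract (generation, normalization, aggregation) can be sketched in a few lines. The snippet below is a minimal illustration under my own assumptions, not the paper's reference implementation (see the linked GitHub repository for that): proxies are generated per class with a tiny k-means over feature vectors, L2-normalized, and predictions aggregate by taking the maximum cosine similarity over each class's proxies. All function names (`kmeans`, `imprint_proxies`, `predict`) are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: random init from the data, then alternate
    assignment and centroid updates for a fixed number of iterations."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        # distance of every point to every centroid, shape (n, k)
        d = np.linalg.norm(X[:, None] - centroids[None], axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

def imprint_proxies(features, labels, n_proxies=3):
    """Generation + normalization: per-class k-means centroids,
    L2-normalized, used directly as classifier weights (no gradients)."""
    proxies, proxy_class = [], []
    for c in np.unique(labels):
        Xc = features[labels == c]
        cents = kmeans(Xc, min(n_proxies, len(Xc)))
        cents /= np.linalg.norm(cents, axis=1, keepdims=True)
        proxies.append(cents)
        proxy_class.extend([c] * len(cents))
    return np.vstack(proxies), np.array(proxy_class)

def predict(features, proxies, proxy_class):
    """Aggregation: classify by the single most similar proxy
    (max cosine similarity across all proxies of all classes)."""
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = feats @ proxies.T
    return proxy_class[sims.argmax(axis=1)]
```

On two well-separated synthetic feature clusters this recovers the labels almost perfectly; with real foundation-model features the paper's point is that the choices inside each of the three steps matter, which this toy setup does not capture.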