In this lecture, Jenia Jitsev explains how scaling laws can be used to predict the performance of foundation models and to systematically compare different learning methods. He shows that broad, generalist pre-training is key to robust generalization and effective transfer across tasks and domains, and how scaling laws make it possible to draw reliable conclusions about large-scale models from small-scale experiments.

A second focus of the lecture is the importance of open foundation models, where data, training procedures, and evaluation are fully transparent: an essential prerequisite for reliable scaling-law studies and the collaborative development of new learning methods. He also discusses why established benchmarks often fail to capture major weaknesses of modern models and presents new evaluation techniques based on controlled variations of simple tasks that make generalization failures clearly visible.

#foundationmodels #aiinresearch @FZJuelichDeResearch
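The core idea of drawing conclusions about large-scale models from small-scale experiments can be sketched with a minimal power-law fit. This is an illustrative example only, not the procedure or numbers from the lecture: the pure power-law form L(N) = a·N^(−b) and all constants below are assumptions chosen for the sketch.

```python
import numpy as np

# Hypothetical small-scale measurements: validation loss vs. model size N.
# The values follow L(N) = a * N**(-b) exactly (illustrative constants,
# not results from the lecture).
a_true, b_true = 400.0, 0.3
sizes = np.array([1e6, 3e6, 1e7, 3e7, 1e8])
losses = a_true * sizes ** (-b_true)

# A power law is linear in log-log space: log L = log a - b * log N,
# so an ordinary least-squares line fit recovers exponent and prefactor.
slope, intercept = np.polyfit(np.log(sizes), np.log(losses), deg=1)
b_fit = -slope
a_fit = np.exp(intercept)

# Extrapolate the fitted law to a model 100x larger than any measured one.
predicted_loss = a_fit * 1e10 ** (-b_fit)
```

In practice, scaling-law studies fit such trends to noisy measurements over several orders of magnitude in model size, data, and compute, and often use saturating forms with an irreducible-loss term; the log-log line fit above is the simplest version of that methodology.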