In the rapidly evolving landscape of large language models (LLMs), the spotlight has largely been on the decoder-only architecture. While these models have shown impressive capabilities across a wide range of generation tasks, the classic encoder-decoder architecture, exemplified by T5 (Text-To-Text Transfer Transformer), remains a popular choice for many real-world applications. Encoder-decoder models often excel at summarization, translation, QA, and more, thanks to their high inference efficiency, design flexibility, and richer encoder representations of the input. Nevertheless, the encoder-decoder architecture has received relatively little attention.

T5Gemma is a new collection of encoder-decoder LLMs developed by converting pretrained decoder-only models into the encoder-decoder architecture through a technique called adaptation (a minimal sketch of the idea appears at the end of this description). T5Gemma is based on Gemma 2, including adapted Gemma 2 2B and 9B models as well as a set of newly trained T5-sized models (Small, Base, Large, and XL). Both pretrained and instruction-tuned T5Gemma models are released publicly.

In this video, I talk about the following:
How are the T5Gemma models trained?
How do the T5Gemma models perform?

For more details, please see https://arxiv.org/pdf/2504.06225 and https://developers.googleblog.com/en/...

Zhang, Biao, Fedor Moiseev, Joshua Ainslie, Paul Suganthan, Min Ma, Surya Bhupatiraju, Fede Lebron, Orhan Firat, Armand Joulin, and Zhe Dong. "Encoder-Decoder Gemma: Improving the Quality-Efficiency Trade-Off via Adaptation." arXiv preprint arXiv:2504.06225 (2025).

Thanks for watching!
LinkedIn: http://aka.ms/manishgupta
HomePage: https://sites.google.com/view/manishg/
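
The adaptation idea can be pictured as weight initialization: both the encoder and the decoder stacks start from the pretrained decoder-only weights, and the decoder additionally gains cross-attention. The PyTorch sketch below only illustrates that initialization under assumed, simplified layer names (ToyLayer, self_attn, mlp, and cross_attn are all hypothetical); it is not the paper's exact recipe or the actual Gemma 2 module structure.

```python
import copy
import torch.nn as nn

class ToyLayer(nn.Module):
    """Stand-in for one pretrained decoder-only transformer block.
    All names here (ToyLayer, self_attn, mlp, cross_attn) are hypothetical."""
    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

def adapt(pretrained: nn.ModuleList):
    """Seed an encoder-decoder model from decoder-only layers."""
    # Encoder: reuse the pretrained weights unchanged; making attention
    # bidirectional is a run-time change (dropping the causal mask),
    # not a weight change.
    encoder = copy.deepcopy(pretrained)
    # Decoder: reuse the pretrained weights and add a new cross-attention
    # block per layer. Seeding it from self-attention is an assumption,
    # one plausible choice rather than the paper's confirmed recipe.
    decoder = copy.deepcopy(pretrained)
    for layer in decoder:
        layer.cross_attn = copy.deepcopy(layer.self_attn)
    return encoder, decoder

pretrained = nn.ModuleList(ToyLayer() for _ in range(2))
encoder, decoder = adapt(pretrained)
# The decoder ends up with more parameters than the encoder,
# since each layer now also carries a cross-attention block.
print(sum(p.numel() for p in decoder.parameters()))
```

Because every weight except cross-attention is copied from the pretrained checkpoint, the adapted model starts close to the decoder-only model's quality and only needs further pretraining to specialize the encoder and the new cross-attention; see the paper linked above for the actual procedure and ablations.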