Google PaLM: Scaling Language Modeling with Pathways
Google PaLM is a 540B-parameter dense Transformer language model trained on 780B tokens of high-quality, diverse text. It was trained at three sizes (8B, 62B, and 540B parameters) on 6144 TPU v4 chips using Pathways, a new ML system for highly efficient training across multiple TPU (Tensor Processing Unit) Pods. When introduced, it achieved state-of-the-art few-shot learning results on hundreds of NLU and NLG benchmarks, including a steep performance increase on BIG-bench tasks and significant improvements in multilingual NLG and source-code generation. It has also been shown to generate impressive explanations, using chain-of-thought prompting, for tasks such as explaining jokes and logical inference.

Here is the agenda for the video:
00:00:00 What is PaLM?
00:02:50 What does PaLM's model architecture look like?
00:04:45 What is the Pathways system?
00:07:49 How does PaLM perform on popular English NLP tasks?
00:09:29 How does PaLM perform on Massive Multitask Language Understanding?
00:10:45 How does PaLM perform on BIG-bench tasks?
00:13:35 How does PaLM perform on reasoning tasks?
00:16:52 How does PaLM perform on coding tasks?
00:26:07 How does PaLM perform on translation tasks?
00:29:27 How does PaLM perform on multilingual NLG tasks?
00:32:27 Is PaLM memorizing data?
00:35:40 What kinds of explanations can PaLM generate?
00:37:35 How does PaLM perform from a bias and toxicity perspective?

For more details, see https://arxiv.org/pdf/2204.02311.pdf and https://ai.googleblog.com/2022/04/pat...

Chowdhery, Aakanksha, Sharan Narang, Jacob Devlin, Maarten Bosma, Gaurav Mishra, Adam Roberts, Paul Barham et al. "PaLM: Scaling language modeling with pathways." arXiv preprint arXiv:2204.02311 (2022).
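The chain-of-thought prompting mentioned above works by pairing each few-shot exemplar with intermediate reasoning steps before the final answer, so the model continues that pattern for a new question. The sketch below shows only the prompt format; the exemplar text and the helper function name are illustrative assumptions, not taken from the paper.

```python
def build_cot_prompt(exemplars, question):
    """Join (question, reasoning, answer) exemplars into a few-shot
    chain-of-thought prompt, ending with the new question for the
    model to complete."""
    parts = []
    for q, reasoning, answer in exemplars:
        # Each exemplar shows the reasoning steps before the answer.
        parts.append(f"Q: {q}\nA: {reasoning} The answer is {answer}.")
    # The new question is left open for the model to continue.
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

exemplars = [
    ("Roger has 5 balls and buys 2 cans of 3 balls each. "
     "How many balls does he have now?",
     "Roger started with 5 balls. 2 cans of 3 balls is 6 balls. 5 + 6 = 11.",
     "11"),
]
prompt = build_cot_prompt(
    exemplars, "A baker has 12 rolls and sells 7. How many remain?")
print(prompt)
```

The prompt ends at "A:", so a sampled completion is expected to first produce reasoning steps and then a final answer, mirroring the exemplar.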