Transformer Neural Networks, ChatGPT's foundation, Clearly Explained!!!

Transformer Neural Networks are the heart of pretty much everything exciting in AI right now. ChatGPT, Google Translate, and many other cool things are based on Transformers. This StatQuest cuts through all the hype and shows you how a Transformer works, one step at a time.

NOTE: If you're interested in learning more about Backpropagation, check out these 'Quests:
The Chain Rule: • The Chain Rule
Gradient Descent: • Gradient Descent, Step-by-Step
Backpropagation Main Ideas: • Neural Networks Pt. 2: Backpropagatio...
Backpropagation Details Part 1: • Backpropagation Details Pt. 1: Optimi...
Backpropagation Details Part 2: • Backpropagation Details Pt. 2: Going ...

If you're interested in learning more about the SoftMax function, check out: • Neural Networks Part 5: ArgMax and So...

If you're interested in learning more about Word Embedding, check out: • Word Embedding and Word2Vec, Clearly ...

If you'd like to learn more about calculating similarities in the context of neural networks and the Dot Product, check out:
Cosine Similarity: • Cosine Similarity, Clearly Explained!!!
Attention: • Attention for Neural Networks, Clearl...

For a complete index of all the StatQuest videos, check out: https://statquest.org/video-index/

If you'd like to support StatQuest, please consider...
Patreon: / statquest
...or...
YouTube Membership: / @statquest
...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/
...or just donating to StatQuest!
https://www.paypal.me/statquest

Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter: / joshuastarmer

0:00 Awesome song and introduction
1:26 Word Embedding
7:30 Positional Encoding
12:53 Self-Attention
23:37 Encoder and Decoder defined
23:53 Decoder Word Embedding
25:08 Decoder Positional Encoding
25:50 Transformers were designed for parallel computing
27:13 Decoder Self-Attention
27:59 Encoder-Decoder Attention
31:19 Decoding numbers into words
32:23 Decoding the second token
34:13 Extra stuff you can add to a Transformer

#StatQuest #Transformer #ChatGPT