17. Transformers Explained Easily: Part 1 - Generative Music AI
Learn the intuition, theory, and mathematical formalization of transformer architectures. Transformers are widely used in deep learning: they now dominate natural language processing, are central to computer vision, and are extensively used in audio and music processing.

Get the lecture slides: https://github.com/musikalkemist/gene...
Website of the Generative Music AI Workshop in Barcelona: https://www.upf.edu/web/mtg/generativ...
Sign up to The Sound of AI Slack community to join the discussion: https://valeriovelardo.com/the-sound-...

======================================
Interested in music AI consulting? https://thesoundofai.com/consulting.html
Interested in music AI recruitment? https://thesoundofai.com/recruitment....
Become a Python ninja with my Advanced Python Programming course: https://the-sound-of-ai-academy.teach...
Connect with Valerio on LinkedIn: / valeriovelardo
Follow Valerio on Twitter: / musikalkemist
======================================

Content
0:00 Intro
2:57 Context
6:01 The intuition
8:30 Encoder
12:15 Encoder block
12:42 Self-attention
16:36 Matrices
17:04 Input matrix
18:29 Query, key, value matrices
21:20 Self-attention formula
22:01 Self-attention: Step 1
28:07 Self-attention: Step 2
30:02 Self-attention: Step 3
32:59 Self-attention: Step 4
45:45 Self-attention: Visual recap
47:47 Multi-head attention
50:49 The problem of sequence order
52:33 Positional encoding
55:30 How to compute positional encoding
1:02:01 Feedforward layer
1:03:27 Add & norm layer
1:06:09 Deeper meaning of encoder components
1:07:10 Encoder step-by-step
1:10:37 Key takeaways
1:12:20 What next?
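As a companion to the chapters on the self-attention formula and positional encoding, here is a minimal NumPy sketch of the two standard computations those chapters cover: scaled dot-product self-attention (the query/key/value projections, scaled similarity scores, softmax, and weighted sum of values) and sinusoidal positional encoding. Variable names and dimensions are illustrative and not taken from the lecture slides.

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention.
    X: (seq_len, d_model) input matrix; W_q, W_k, W_v project it to Q, K, V."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v          # query, key, value matrices
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)              # similarity scores, scaled by sqrt(d_k)
    # softmax over each row, shifted for numerical stability
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                           # weighted sum of value vectors

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))"""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / 10000 ** (2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                 # even dimensions
    pe[:, 1::2] = np.cos(angles)                 # odd dimensions
    return pe
```

In a full encoder block, the positional encoding is added to the token embeddings before attention, and multi-head attention runs several independent copies of `self_attention` on lower-dimensional projections and concatenates the results.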