Full Jacobian Matrix using forward-mode AD in JAX
Let's use the primitive of forward-mode Automatic Differentiation, the Jacobian-vector product (jax.jvp), to build the full, dense Jacobian matrix of a vector-valued function. We can make the interesting observation that multiplying any matrix by the i-th unit vector from the right extracts its i-th column. This observation lets us build the full, dense Jacobian by evaluating one jvp per unit vector and then concatenating the resulting columns. Hence, we have an approach that scales linearly in the number of columns (but is constant in the number of rows).

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-lea...

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patreon here: / mlsim

🪙 : Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/Felix...

-------

Timestamps:
00:00 Intro
00:20 A vector-valued function
00:49 Jacobian using JAX function (jax.jacfwd)
01:50 A Jacobian-vector product (jvp)
03:20 Extracting Jacobian columns using unit vectors
05:25 Implement Jacobian function
07:49 Comparing own implementation with JAX's
08:25 Some considerations (and complexity)
09:34 Outro
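The column-extraction idea described above can be sketched in a few lines of JAX. This is a minimal illustration, not the video's exact code: the function `f` and the helper name `jacobian_fwd` are hypothetical choices for demonstration.

```python
import jax
import jax.numpy as jnp


def f(x):
    # Hypothetical vector-valued function R^3 -> R^2 for illustration
    return jnp.array([x[0] * x[1], x[1] * jnp.sin(x[2])])


def jacobian_fwd(fun, x):
    """Assemble the dense Jacobian column by column via jax.jvp.

    Multiplying the Jacobian J by the i-th unit vector from the right
    yields the i-th column of J; stacking all columns gives the full
    matrix. Cost: one jvp per input dimension (linear in the number of
    columns, independent of the number of rows).
    """
    n = x.shape[0]
    unit_vectors = jnp.eye(n)
    # jax.jvp returns (f(x), J @ tangent); we keep only the tangent output
    columns = [jax.jvp(fun, (x,), (unit_vectors[i],))[1] for i in range(n)]
    return jnp.stack(columns, axis=1)


x = jnp.array([1.0, 2.0, 3.0])
J = jacobian_fwd(f, x)
```

As a sanity check, the result should match JAX's built-in forward-mode Jacobian, `jax.jacfwd(f)(x)`.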