Often, one is not interested in the full Jacobian matrix of a vector-valued function, but in its product with a vector. This operation is represented by the jvp intrinsic in the JAX deep learning framework.

Whenever we perform forward-mode sensitivity analysis, additional Jacobian derivative information is needed at certain points of the computation. We could (expensively) build the dense Jacobian matrices and carry out the operations they appear in. Often, however, the full dense Jacobian is of no particular interest; only its effect in a matrix-vector product matters. For forward-mode sensitivities, the vector is right-multiplied (for adjoint mode, it would be left-multiplied). JAX (like other Automatic Differentiation frameworks) provides an intrinsic called jvp that performs this Jacobian-Vector Product highly efficiently by pushing forward-mode AD directly through the computation graph, without ever materializing the Jacobian.

This video shows what the Jacobian-Vector Product is and what the shapes of the involved quantities are. Then, we discuss the interface to JAX.

Also check out JAX's documentation: https://jax.readthedocs.io/en/latest/...

-------

📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): https://github.com/Ceyron/machine-lea...

📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler

💸 : If you want to support my work on the channel, you can become a Patreon here: / mlsim

🪙 : Or you can make a one-time donation via PayPal: https://www.paypal.com/paypalme/Felix...

-------

Timestamps:
00:00 Intro
00:29 A vector-valued function
00:55 Obtaining the full Jacobian
02:07 Conceptually performing a Jacobian-Vector Product
03:23 Using jax.jvp
06:36 Outro
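A minimal sketch of the jax.jvp interface described above (the function f below is a hypothetical example, not from the video): jax.jvp takes the function, a tuple of primal inputs, and a tuple of tangent vectors, and returns both the primal output and the Jacobian-vector product, without ever forming the dense Jacobian.

```python
import jax
import jax.numpy as jnp

# A vector-valued function f: R^3 -> R^2 (hypothetical example)
def f(x):
    return jnp.array([x[0] * x[1], jnp.sin(x[2]) + x[0]])

x = jnp.array([1.0, 2.0, 3.0])  # primal point, shape (3,)
v = jnp.array([0.1, 0.2, 0.3])  # tangent vector, same shape as x

# Forward-mode AD: returns f(x) and J(x) @ v in one pass
primal_out, tangent_out = jax.jvp(f, (x,), (v,))

# Sanity check against the explicit dense Jacobian:
# jacfwd builds the full (2, 3) matrix, which jvp avoids
J = jax.jacfwd(f)(x)
print(jnp.allclose(tangent_out, J @ v))  # True
```

Note the shapes: the tangent vector v matches the input (here (3,)), while the resulting tangent_out matches the output (here (2,)) — the product J @ v maps input-space perturbations to output-space perturbations.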