Kernelization of Natural Gradient Methods for PIML || Oct 10, 2025
Speaker, institute & title
1) Nilo Schwencke, Paris-Saclay University, "Kernelization of Natural Gradient Methods for Physics Informed Machine-Learning with connections to Galerkin schemes".

Abstract: We present three contributions:
a) Through a thorough analysis of the training dynamics of Physics-Informed Neural Networks (PINNs) under Natural Gradient, we introduce the notion of the Natural Neural Tangent Kernel (NNTK). We leverage it to define the empirical Tangent Space and the empirical Natural Gradient, yielding the AnaGRAM algorithm, which scales as O(min(PS², SP²)), where P is the number of parameters and S the number of samples. We also prove connections between the natural gradient of PINNs and Green's function theory.
b) Building on a deep empirical analysis of the training dynamics of AnaGRAM, we propose an adaptive cutoff-regularization scheme, denoted AMStraMGRAM, which improves AnaGRAM's performance up to machine precision on simple problems.
c) From a more theoretical point of view, we introduce a unifying theory based on the kernelization of the Natural Gradient that encompasses Galerkin methods. As a byproduct, we show that strong and weak formulations can be understood within the same unifying framework, generalizing our previous results on Green's functions.
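To give a rough feel for the ideas in a) and b), the sketch below is a minimal toy, not the speaker's AnaGRAM implementation: it takes an empirical natural-gradient (Gauss-Newton) step for a small PINN on a 1D Poisson problem, computing the update via an SVD pseudo-inverse of the residual Jacobian, with a singular-value cutoff standing in for the cutoff regularization. The Poisson problem, network width, relative cutoff, learning rate, and all function names are illustrative assumptions, not details from the talk.

```python
# Illustrative sketch only: empirical natural-gradient (Gauss-Newton) step for a
# PINN on -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0. Assumed setup, not the
# speaker's code.
import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def init_params(key, widths=(1, 32, 32, 1)):
    params = []
    for m, n in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (m, n)) / jnp.sqrt(m), jnp.zeros(n)))
    return params

def mlp(params, x):
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def residuals(flat_params, unravel, x_int, x_bnd, f):
    """PDE residual -u''(x) - f(x) at interior samples, u(x) at boundary samples."""
    params = unravel(flat_params)
    u = lambda x: mlp(params, x)
    u_xx = jax.vmap(jax.grad(jax.grad(u)))(x_int)
    return jnp.concatenate([-u_xx - jax.vmap(f)(x_int), jax.vmap(u)(x_bnd)])  # shape (S,)

def natural_gradient_step(flat_params, unravel, x_int, x_bnd, f, cutoff=1e-6, lr=1.0):
    r = residuals(flat_params, unravel, x_int, x_bnd, f)               # residuals, shape (S,)
    J = jax.jacrev(residuals)(flat_params, unravel, x_int, x_bnd, f)   # Jacobian, shape (S, P)
    # Minimum-norm least-squares solution of J @ delta ~ r via an SVD pseudo-inverse;
    # singular values below the relative cutoff are discarded (cutoff regularization).
    U, s, Vt = jnp.linalg.svd(J, full_matrices=False)
    s_inv = jnp.where(s > cutoff * s[0], 1.0 / s, 0.0)
    delta = Vt.T @ (s_inv * (U.T @ r))
    return flat_params - lr * delta

key = jax.random.PRNGKey(0)
flat_params, unravel = ravel_pytree(init_params(key))
x_int = jnp.linspace(0.0, 1.0, 64)[1:-1]                 # interior collocation points
x_bnd = jnp.array([0.0, 1.0])                            # boundary points
f = lambda x: (jnp.pi ** 2) * jnp.sin(jnp.pi * x)        # exact solution sin(pi x)
for _ in range(50):
    flat_params = natural_gradient_step(flat_params, unravel, x_int, x_bnd, f)
```

The cost is dominated by the SVD of the S-by-P residual Jacobian, which is consistent with the O(min(PS², SP²)) scaling quoted in the abstract; the fixed relative cutoff here is the crude counterpart of what AMStraMGRAM would adapt during training.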