Link to Arxiv Paper: https://arxiv.org/abs/2503.22600
Link to Colab Notebook: https://colab.research.google.com/dri...

This video discusses a research paper from Carnegie Mellon University about using diffusion transformers to solve partial differential equations (PDEs). Here's a breakdown of the video:

- The presenter explains the core concept: training a diffusion model on PDEs, compressing them into a 2D latent space, and having the model output abstractions of those PDEs [01:52].
- The model creates probabilistic blends of the PDEs it has been trained on, but it cannot produce novel PDEs [02:44].
- The presenter uses the analogy of putting a cat in Schrödinger's box to explain how the model works [03:13].
- The video highlights the limitations of AI models, showing how they struggle with certain tasks, unlike GPT-4 [04:39].
- The presenter attempts to reconstruct the paper's findings, but simplifies the model due to computational limitations [06:22].
- The video explores two approaches for modeling and predicting the evolution of PDEs with deep learning [07:05]:
  - Flow matching with diffusion transformers [07:13]
  - Swarm diffusion [08:10]
- The presenter explains the architecture of the diffusion transformer, which adds a diffusion block within the feed-forward layer of a standard transformer [09:46].
- The video compares the performance of the two approaches, showing that the swarm diffusion model achieves better results at lower computational cost [12:29].
- The presenter concludes by discussing the nature of diffusion in AI, questioning whether it constitutes true intelligence [14:32].
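To make the flow-matching idea mentioned at [07:13] concrete, here is a minimal NumPy sketch, not the paper's actual model. Flow matching trains a velocity field v(x, t) so that integrating dx/dt = v from t=0 to t=1 transports source samples onto the target distribution; the training target along the linear path x_t = (1-t)·x0 + t·x1 is simply x1 - x0. The constant-vector "model" `theta`, the learning rate, and the sample counts below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Source samples (standard normal) and target samples (normal shifted by [3, -2]).
x0 = rng.normal(0.0, 1.0, size=(4096, 2))
x1 = rng.normal(0.0, 1.0, size=(4096, 2)) + np.array([3.0, -2.0])

# Hypothetical, deliberately tiny velocity "model": one constant vector theta.
# A real solver would use a neural network v(x, t) such as a diffusion transformer.
theta = np.zeros(2)
lr = 0.5
for _ in range(100):
    t = rng.uniform(size=(4096, 1))
    xt = (1 - t) * x0 + t * x1      # points on the linear interpolation path
    target = x1 - x0                # conditional flow-matching velocity target
    # Gradient of the mean squared error ||theta - target||^2 w.r.t. theta.
    grad = 2 * (theta - target).mean(axis=0)
    theta -= lr * grad

# Sampling: integrate dx/dt = theta from t=0 to t=1 (exact for a constant field).
samples = x0 + theta
print(np.round(samples.mean(axis=0), 1))
```

Because the field is constant, the optimal theta is just the mean displacement between the two sample sets, so the generated samples land on the shifted target; the same objective, with a transformer in place of `theta`, is what the video's first approach trains in the PDE latent space.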