Online Monte Carlo Seminar (sites.google.com/view/monte-carlo-seminar/)
Speaker: Alain Oliviero-Durmus (Ecole Polytechnique)
Title: Toward Principled Inference and Convergence Guarantees in Diffusion Models
Time: 8:30 am PT = 11:30 am ET = 4:30 pm London = 5:30 pm Paris = 11:30 pm Beijing

Abstract: My talk is based on two contributions on diffusion models and their application to Bayesian inference.

In the first part of the talk, I will present new KL convergence guarantees, under minimal assumptions, for both continuous and discrete score-based diffusion models. Specifically, I focus on discretizations derived from the Ornstein-Uhlenbeck semigroup and its kinetic variant, and show that sharp and explicit KL bounds can be obtained for any data distribution with finite Fisher information, thereby avoiding early stopping, smoothing, or strong regularity conditions.

In the second part, I will shift to the use of diffusion models for solving inverse problems, such as image reconstruction or source separation. Here, I introduce a novel mixture-based posterior sampling framework that combines diffusion priors with observational data using a principled Gibbs sampling scheme. This approach offers theoretical guarantees, task-agnostic applicability, and robust performance across a wide range of problems, including ten image restoration tasks and musical source separation, without relying on crude approximations or heavy heuristic tuning.

#generativemodel #generativeai #diffusionmodels
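As background for the first part of the abstract, the sketch below shows the standard setup that such discretization analyses apply to: an Euler-Maruyama discretization of the time-reversed Ornstein-Uhlenbeck SDE. This is a minimal illustration, not the speaker's construction; to keep it self-contained, the data distribution is taken to be Gaussian (N(0, sigma0_sq)) so the score of each diffused marginal is available in closed form rather than learned.

```python
import numpy as np

def score(x, t, sigma0_sq):
    # Exact score of the OU-diffused marginal p_t when the data
    # distribution is N(0, sigma0_sq): p_t = N(0, var_t) with
    # var_t = sigma0_sq * e^{-2t} + (1 - e^{-2t}).
    var_t = sigma0_sq * np.exp(-2.0 * t) + 1.0 - np.exp(-2.0 * t)
    return -x / var_t

def reverse_ou_sample(n, T=5.0, steps=500, sigma0_sq=4.0, rng=None):
    """Euler-Maruyama discretization of the time-reversed OU SDE
        dY_s = (Y_s + 2 * score(Y_s, T - s)) ds + sqrt(2) dB_s,
    initialized at the OU stationary distribution N(0, 1) ~ p_T."""
    rng = np.random.default_rng(rng)
    h = T / steps
    y = rng.standard_normal(n)
    for k in range(steps):
        t = T - k * h  # forward time corresponding to this reverse step
        drift = y + 2.0 * score(y, t, sigma0_sq)
        y = y + h * drift + np.sqrt(2.0 * h) * rng.standard_normal(n)
    return y

samples = reverse_ou_sample(50_000, rng=0)
print(samples.var())  # should be close to sigma0_sq = 4.0
```

With an exact score, the only error sources are the finite horizon T and the step size h, which is precisely the regime the KL bounds in the abstract quantify; with a learned score, a score-estimation term is added on top.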
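For the second part, the following toy sketch illustrates the general shape of split-Gibbs posterior sampling for an inverse problem y = A x + noise: an auxiliary variable u decouples the data-consistency step from the prior step, which is where a diffusion prior would plug in. This is an illustrative stand-in with a Gaussian prior (so the target posterior is known exactly), not the mixture-based framework from the talk; all names and parameter choices here are hypothetical.

```python
import numpy as np

def split_gibbs(y, A, sigma=0.1, tau=0.5, iters=4000, rng=None):
    """Toy split-Gibbs sampler for y = A x + N(0, sigma^2 I) with a
    standard-normal prior on x. The auxiliary u ~ N(x, tau^2 I) splits
    each sweep into a prior-side step (draw u | x) and a data-side step
    (draw x | u, y), mimicking how diffusion-prior posterior samplers
    alternate denoising and data-consistency updates."""
    rng = np.random.default_rng(rng)
    m, d = A.shape
    x = rng.standard_normal(d)
    AtA, Aty = A.T @ A, A.T @ y
    # x | u, y is Gaussian with precision P (constant here, so precompute).
    P = AtA / sigma**2 + np.eye(d) / tau**2 + np.eye(d)
    L = np.linalg.cholesky(P)
    samples = []
    for it in range(iters):
        u = x + tau * rng.standard_normal(d)          # u | x
        b = Aty / sigma**2 + u / tau**2
        mean = np.linalg.solve(P, b)
        x = mean + np.linalg.solve(L.T, rng.standard_normal(d))  # x | u, y
        if it >= iters // 2:                          # discard burn-in
            samples.append(x)
    return np.array(samples)
```

Because u is marginalized out exactly, the x-marginal of the chain is the true posterior under the Gaussian prior; replacing the "draw u | x" step with a diffusion-based denoising step is the kind of substitution whose correctness the principled schemes in the abstract are designed to justify.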