Aaditya Ramdas (Carnegie Mellon) - LIDS Student Conference 2026
Talk Title: "A complete generalization of Kelly betting"

Abstract: In the 1950s, John Kelly (working at Bell Labs, like Claude Shannon) fundamentally connected gambling on coin tosses with Shannon's information theory, and this was soon extended by Leo Breiman (1963) to more general settings. In an excellent 1999 Yale PhD thesis, Jonathan Li defined and studied a concept called the reverse information projection (RIPr), which is a Kullback-Leibler projection of a given probability measure onto a given set of probability measures. Grunwald et al. (2024) showed that the RIPr characterizes the log-optimal bet/e-variable of a point alternative hypothesis against a composite null hypothesis, albeit under several assumptions (convexity, absolute continuity, etc.). In this talk, we will show how to fully generalize the theory underlying Kelly betting and the RIPr, showing that the RIPr is always well defined, without any assumptions. Further, a strong duality result identifies it as the dual of an optimal bet/e-variable called the numeraire, which is unique and also always exists without assumptions. This fully generalizes classical Kelly betting to arbitrary composite nulls; the same assumption-free strong duality also holds for Renyi/Hellinger projections (replacing the logarithmic utility with power utilities). The talk will not assume any prior knowledge of these topics. This is joint work with Martin Larsson and Johannes Ruf, and appeared in the Annals of Statistics (2025).

---------------------------------------------------

Biography: Aaditya Ramdas is a tenured Associate Professor in the Department of Statistics and Data Science and the Machine Learning Department at Carnegie Mellon University. He was a postdoc at UC Berkeley (2015–2018) mentored by Michael Jordan and Martin Wainwright, and obtained his PhD at CMU (2010–2015) under Aarti Singh and Larry Wasserman, receiving the Umesh K. Gavaskar Memorial Thesis Award. His work has been recognized by the Presidential Early Career Award (PECASE), an NSF CAREER Award, and a Sloan Fellowship in Mathematics. His research focuses on algorithms with theoretical guarantees in mathematical statistics and machine learning, specifically post-selection inference, game-theoretic statistics, and predictive uncertainty quantification.
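The Kelly–Shannon connection the abstract opens with can be illustrated with the classic coin-toss setup (not part of the talk material itself, just a standard textbook sketch): betting a fraction f of wealth at even odds on a coin with heads probability p, the log-optimal fraction is f* = 2p − 1, and the resulting growth rate equals log 2 minus the binary Shannon entropy H(p).

```python
import math

def growth_rate(f, p):
    """Expected log-growth per round when betting a fraction f of wealth
    at even odds on a coin with heads probability p."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

def kelly_fraction(p):
    """Classic Kelly-optimal fraction for even-odds coin bets: f* = 2p - 1."""
    return 2 * p - 1

p = 0.6
f_star = kelly_fraction(p)  # 0.2

# Kelly's identity: the optimal growth rate is log(2) - H(p),
# where H(p) is the binary Shannon entropy in nats.
entropy = -p * math.log(p) - (1 - p) * math.log(1 - p)
assert abs(growth_rate(f_star, p) - (math.log(2) - entropy)) < 1e-12

# f* is a maximizer: nearby fractions grow strictly slower.
assert growth_rate(f_star, p) > growth_rate(f_star + 0.05, p)
assert growth_rate(f_star, p) > growth_rate(f_star - 0.05, p)
```

The generalization discussed in the talk replaces this single-coin null with an arbitrary composite null hypothesis, where the numeraire plays the role of the optimal bet.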