The Key Equation Behind Probability
Get 4 months extra on a 2-year plan here: https://nordvpn.com/artemkirsanov. It's risk-free with Nord's 30-day money-back guarantee!

=====

My name is Artem, I'm a neuroscience PhD student at Harvard University.

🌎 Website and social links: https://kirsanov.ai/
📥 "Receptive Field" neuro-newsletter: https://artemkirsanov.substack.com/
✨ Support me on Patreon to get access to the Discord community: / artemkirsanov

=====

In this video, we explore the fundamental concepts that underlie probability theory and its applications in neuroscience and machine learning. We begin with the intuitive idea of surprise and its relation to probability, using real-world examples to illustrate these concepts. From there, we move into more advanced topics:

1) Entropy – measuring the average surprise in a probability distribution.
2) Cross-entropy – the information lost when approximating one distribution with another.
3) Kullback–Leibler (KL) divergence – quantifying the difference between two probability distributions.

🕒 OUTLINE:
00:00 Introduction
02:00 Sponsor: NordVPN
04:07 What is probability (Bayesian vs Frequentist)
06:42 Probability Distributions
10:17 Entropy as average surprisal
13:53 Cross-Entropy and Internal models
19:20 Kullback–Leibler (KL) divergence
20:46 Objective functions and Cross-Entropy minimization
24:22 Conclusion & Outro

=====

Special thanks to Crimson Ghoul for providing English subtitles!
Icons by https://www.freepik.com/

=====

Disclaimer: This channel is my personal project. The views and content expressed here are my own and are separate from my research role at Harvard University.

#probability #entropy #datascience
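The three quantities listed above can be sketched numerically. A minimal Python example, following the standard definitions (the toy distributions `p` and `q` are made up for illustration; they are not from the video):

```python
import math

def surprisal(p_x):
    # Surprise of an outcome with probability p_x, in bits: -log2(p_x)
    return -math.log2(p_x)

def entropy(p):
    # Average surprisal under p: H(p) = -sum_x p(x) log2 p(x)
    return sum(px * surprisal(px) for px in p if px > 0)

def cross_entropy(p, q):
    # Average surprisal when outcomes follow p but are coded with model q:
    # H(p, q) = -sum_x p(x) log2 q(x)
    return sum(px * surprisal(qx) for px, qx in zip(p, q) if px > 0)

def kl_divergence(p, q):
    # Extra bits paid for using q in place of the true p:
    # D_KL(p || q) = H(p, q) - H(p), always >= 0
    return cross_entropy(p, q) - entropy(p)

# Toy distributions over four outcomes (illustrative values only)
p = [0.5, 0.25, 0.125, 0.125]   # "true" distribution
q = [0.25, 0.25, 0.25, 0.25]    # uniform internal model

print(entropy(p))           # 1.75 bits
print(cross_entropy(p, q))  # 2.0 bits
print(kl_divergence(p, q))  # 0.25 bits
```

Note how the KL divergence is exactly the gap between cross-entropy and entropy, which is why minimizing cross-entropy with respect to the model `q` (as in the "Objective functions" chapter) is equivalent to minimizing the divergence from the true distribution.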