Welcome to the eighth lecture of my Deep Learning series! 🧠 In this video, we dive deep into the Heart of the Neural Network: Backpropagation. Building on the computational graph we visualized in the last lecture, we continue our manual calculation of gradients, moving from the end of the graph back to the input layers.

We focus on the Chain Rule of Calculus, the mathematical engine behind backpropagation. To find how the final Loss changes with respect to any input (like a or b), we simply multiply the Local Derivative (the immediate impact of the operation) by the Upstream Gradient (the gradient flowing back from the end). To make sure our intuition is solid, we don't just solve the equations; we verify every single step with Python code, computing the slope numerically (using h = 0.0001) to confirm that our analytical Chain Rule calculations are accurate. By the end, we derive the gradient rules for Addition (which routes gradients) and Multiplication (which scales gradients). A minimal sketch of this workflow is included below.

In this video, we cover:
✅ The Chain Rule: understanding the "Heart of Backpropagation" and how to multiply local and upstream gradients.
✅ Manual Backprop: calculating the gradients for nodes c, e, a, and b step by step.
✅ Numerical Verification: using the definition of a derivative in Python to prove our calculus is correct.
✅ Gradients of Operations: why the gradient of addition is 1.0 (the gradient flows through unchanged) and why the gradient of multiplication is the value of the other term.
✅ Completing the Graph: manually populating the .grad property for every node in our simple neural network.

Resources:
🔗 GitHub Repository (Code & Notes): https://github.com/gautamgoel962/Yout...
🔗 Follow me on Instagram: / gautamgoel978

Prerequisites: a basic understanding of slopes and derivatives.

Subscribe to continue the journey! Now that we have calculated these gradients by hand, in the next lecture we will write the magic backward() function to automate this entire process in our code! 📉🚀

#DeepLearning #Python #Backpropagation #Micrograd #ChainRule #Calculus #GradientDescent #Hindi #AI #MachineLearning
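For anyone who wants to follow along before the automated backward() lecture, here is a minimal Python sketch in the spirit of this lecture: manual backprop with the chain rule, then a numerical check using h = 0.0001. The specific graph (e = a * b, d = e + c, L = d * f) and the values are my own illustrative assumptions, not necessarily the exact graph used in the video.

```python
# Minimal sketch (assumed example graph, not the lecture's exact code):
# forward pass through a tiny computational graph, then manual backprop.
a, b, c, f = 2.0, -3.0, 10.0, -2.0
e = a * b          # multiplication node
d = e + c          # addition node
L = d * f          # final loss

# Backward pass: local derivative multiplied by the upstream gradient.
dL_dL = 1.0
dL_dd = f * dL_dL          # multiplication scales by the other term
dL_df = d * dL_dL
dL_de = 1.0 * dL_dd        # addition routes the gradient through unchanged
dL_dc = 1.0 * dL_dd
dL_da = b * dL_de          # chain rule: local (b) times upstream (dL/de)
dL_db = a * dL_de

# Numerical verification of dL/da using the definition of the derivative.
h = 0.0001
def loss(a, b, c, f):
    return (a * b + c) * f

numeric_dL_da = (loss(a + h, b, c, f) - loss(a, b, c, f)) / h
print(dL_da, numeric_dL_da)   # analytical vs. numerical: should closely agree
```

The same pattern (nudge one input by h, recompute the loss, divide the change by h) can be repeated for b, c, e, and f to check every gradient in the graph.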