Welcome to Part 2 of our two-day Machine Learning Workshop, where we dive deeper into neural networks and advanced linear regression techniques using JASP. Led by Dr. Vahid Aryadoust, this session builds on the foundation laid in Part 1, focusing on splitting data into training and testing sets to improve predictive accuracy. Dr. Aryadoust guides you through the inner workings of neural networks, explaining key concepts such as neurons, activation functions, summation functions, bias, weights, and loss functions. You will also gain insight into supervised and unsupervised learning, including their applications in classification and regression.

Topics Covered:
- Neurons and Activation Functions: how neurons process input data, and the role of activation functions such as Sigmoid, Sine, Cosine, Hyperbolic Tangent (tanh), ReLU, and Leaky ReLU.
- Summation Function and Bias: how inputs are combined into a weighted sum and shifted by a bias term.
- Weights and Loss Function: how weights are assigned and optimized to minimize prediction error.
- Supervised Learning: training models on labeled data for tasks such as spam detection (classification) and house price prediction (regression).
- Unsupervised Learning: identifying patterns in unlabeled data.
- Backpropagation: the standard algorithm for training neural networks.
- Advanced Training Algorithms: enhanced methods such as rprop+ and rprop- (resilient propagation with and without weight backtracking), and the globally convergent variants grprop-sag (smallest absolute gradient) and grprop-slr (smallest learning rate).

Evaluation Metrics:
- Mean Squared Error (MSE): the average of the squared differences between actual and predicted values.
- Root Mean Squared Error (RMSE): the square root of the MSE, a measure of the standard deviation of the residuals.
- Mean Absolute Error (MAE), also called Mean Absolute Deviation (MAD): the average of the absolute differences between actual and predicted values.
- Mean Absolute Percentage Error (MAPE): the average absolute error expressed as a percentage of the actual values.
- R²: indicates how well the model's predictions approximate the actual data, with values closer to 1 being better.

This workshop is supported by a grant from the UK Association for Language Testing and Assessment (UKALTA). Each session lasts approximately two hours, offering hands-on experience and practical insight into using JASP for machine learning and statistical analysis. Stay engaged and deepen your understanding of machine learning techniques by watching this informative session!

#MachineLearning #NeuralNetworks #LinearRegression #JASP #LanguageAssessment #StatisticalInference #PredictiveModeling #Workshop #AdvancedTechniques
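To make these ideas concrete, here is a minimal plain-Python sketch of a single neuron's summation-plus-activation step and of the evaluation metrics listed above. This is illustrative only, not JASP's implementation; all weights, inputs, and data values below are made up for the example.

```python
import math

# --- A single neuron: weighted summation, bias, then activation ---

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    return max(0.0, z)

def leaky_relu(z, alpha=0.01):
    # Unlike ReLU, negative inputs pass through with a small slope alpha.
    return z if z > 0 else alpha * z

def neuron(inputs, weights, bias, activation):
    """Summation function (dot product of inputs and weights, plus bias),
    passed through an activation function."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# --- Evaluation metrics: MSE, RMSE, MAE/MAD, MAPE, R² ---

def regression_metrics(actual, predicted):
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    mse = sum(e * e for e in errors) / n          # mean squared error
    rmse = math.sqrt(mse)                         # root mean squared error
    mae = sum(abs(e) for e in errors) / n         # mean absolute error (MAD)
    # MAPE: absolute error as a percentage of each actual value
    mape = 100.0 * sum(abs(e / a) for e, a in zip(errors, actual)) / n
    mean_actual = sum(actual) / n
    ss_tot = sum((a - mean_actual) ** 2 for a in actual)
    r2 = 1.0 - (mse * n) / ss_tot                 # coefficient of determination
    return {"MSE": mse, "RMSE": rmse, "MAE": mae, "MAPE": mape, "R2": r2}

# Toy example with made-up numbers:
output = neuron([0.5, -1.0], weights=[0.8, 0.2], bias=0.1, activation=relu)

actual = [3.0, 5.0, 2.0, 7.0]
predicted = [2.5, 5.0, 2.0, 8.0]
metrics = regression_metrics(actual, predicted)
```

Note that MAPE is undefined whenever an actual value is zero, and an R² close to 1 indicates that the predictions track the actual data closely.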