Guide to Tuning the Many Hyperparameters of a Genetic Algorithm (GA)
In a Genetic Algorithm (GA), there are five key hyperparameters (population size, number of parents, number of elites, crossover rate, and mutation rate), along with the hyperparameters of a selection operator that adjust so-called selection pressure. In this video, I describe the collective effect of these six hyperparameters on the performance of a Genetic Algorithm.

I describe how the population size (M) represents a computational cost paid to increase the general accuracy of the algorithm, allowing it to innovate through increased capacity. Within a given population size, the other parameters adjust the dynamics of the search. The number of parents (R) sets the amount of background information retained in the system, so that the difference M - R (which I call reproductive skew) sets the potential for exploration of new solutions. That novelty is only possible through mutation, set by the mutation rate (Pm), with the shape of the trajectories toward new candidate solutions significantly modulated by the crossover rate (Pc), which has little effect once no diversity is left in the population. On top of all of these parameters is the selection pressure (tuned in different ways for different selection operators), which represents how much greedy pressure there is for satisficing (i.e., converging on a good-enough local solution as opposed to searching for a better global solution).

I try to capture all of this in different graphical frameworks to help remember how these parameters relate to exploration and exploitation/fine-tuning, and I close with a characterization of evolutionary systems (in general) in terms of drift fields that inevitably switch from exploration to exploitation to random steady-state movement. It is the goal of the operations researcher employing the optimization metaheuristic to tune hyperparameters to best navigate this "drift field" space.

This video was recorded by Theodore P. Pavlic to support IEE/CSE 598 (Bio-Inspired AI and Optimization) at Arizona State University.
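To make these knobs concrete, here is a minimal, self-contained sketch (not code from the video) of a generational GA on a toy OneMax objective. It uses the video's notation M, R, Pm, and Pc, and adds an elite count E and a tournament size K as one illustrative way to realize elitism and selection pressure; the objective, the choice of uniform crossover and bit-flip mutation, and all parameter values are assumptions for illustration only.

```python
import random

# Hyperparameters (notation follows the video; values are illustrative)
M = 50      # population size: capacity / computational cost per generation
R = 20      # number of parents retained for breeding (M - R = reproductive skew)
E = 2       # number of elites copied unchanged into the next generation
Pc = 0.9    # crossover rate: probability a pair is recombined
Pm = 0.02   # per-gene mutation rate: the source of novelty
K = 3       # tournament size: larger K means higher selection pressure
GENS = 100  # number of generations to run
N = 40      # genome length for the toy OneMax objective (assumption)

def fitness(genome):
    # Toy objective (OneMax): count of 1-bits; stands in for any objective.
    return sum(genome)

def tournament(parents):
    # Selection pressure is tuned by K: the best of K random candidates wins.
    return max(random.sample(parents, K), key=fitness)

def crossover(a, b):
    # Uniform crossover applied with probability Pc; otherwise copy a parent.
    if random.random() < Pc:
        return [x if random.random() < 0.5 else y for x, y in zip(a, b)]
    return a[:]

def mutate(genome):
    # Bit-flip mutation at rate Pm injects the novelty exploration relies on.
    return [1 - g if random.random() < Pm else g for g in genome]

pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(M)]
for gen in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elites = [g[:] for g in pop[:E]]   # exploitation: keep the best unchanged
    parents = pop[:R]                  # information retention: breed from top R only
    children = [mutate(crossover(tournament(parents), tournament(parents)))
                for _ in range(M - E)]  # reproductive skew fills the remaining slots
    pop = elites + children

print("best fitness:", max(fitness(g) for g in pop))
```

With this structure, the trade-offs described above become tangible: raising K or shrinking R pushes the run toward exploitation and faster convergence on a good-enough solution, while lowering K, enlarging R, or raising Pm keeps diversity alive and favors exploration at the cost of more generations.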