IAIFI Colloquium
Sign up for IAIFI's mailing list: https://mailman.mit.edu/mailman/listi...

Carlo V. Cannistraci, Chair Professor, Tsinghua Laboratory of Brain and Intelligence (THBI)

Brain-inspired sparse network science for next generation efficient and sustainable AI

Artificial neural networks (ANNs) are foundational to contemporary artificial intelligence (AI); however, their conventional fully connected architectures are computationally inefficient. Contemporary large language models consume power at rates over 100 times that of the human brain. In stark contrast, the brain's inherently sparse connectivity enables exceptional capabilities with minimal energy expenditure: learning with just a few watts. Brain-inspired network science research can play a relevant role in designing low-consumption, efficient deep learning, and we need to develop concepts and theories for an ecological and sustainable approach to AI. Some of these new computing paradigms can be inspired by the physics of the brain network architecture and its complex systems biology. At the Center for Complex Network Intelligence (CCNI) within the Tsinghua Laboratory of Brain and Intelligence (THBI), our research focuses on three pivotal features of brain networks that contribute to efficient computation:
(1) Connectivity sparsity: implementing sparse connections to reduce computational overhead while maintaining performance;
(2) Connectivity morphology: exploring the spatial patterns of neural connections to optimize information processing;
(3) Neuro-glia coupling: investigating the interactions between neurons and glial cells to enhance computational efficiency.
This talk will introduce the Cannistraci-Hebb Training soft rule (CHTs), a brain-inspired network science theory that employs a gradient-free approach, relying solely on network topology to predict sparse connectivity during dynamic sparse training.
CHTs have demonstrated the potential to achieve ultra-sparse networks with approximately 1% connectivity, outperforming fully connected networks in various tasks. Additionally, we will discuss our recent study on the relationship between sparse morphological connectivity and spatiotemporal intelligence. This research introduces neuromorphic dendritic network computation with silent synapses, a model that emulates visual motion perception by integrating synaptic organization with dendritic tree-like morphology. The model exhibits exceptional performance in visual motion perception tasks, underscoring the potential of bio-inspired approaches to enhance the transparency and efficiency of modern AI systems.
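To make the dynamic sparse training idea concrete, here is a minimal sketch of one prune-and-regrow step in which regrowth is decided purely by network topology, with no gradients. Note the assumptions: this is not the authors' CHT implementation, and the regrowth score below is a simple common-neighbors heuristic on the layer's bipartite graph, used only as an illustrative stand-in for the Cannistraci-Hebb epitopological score; the function name `prune_and_regrow` and all parameters are hypothetical.

```python
import numpy as np

def prune_and_regrow(weights, mask, prune_frac=0.2):
    """One prune/regrow step of dynamic sparse training (illustrative sketch).

    `mask` is a 0/1 matrix marking the active links of a sparse layer.
    Prune the weakest active links by weight magnitude, then regrow the
    same number of inactive links ranked by a topology-only score
    (common neighbors on the bipartite layer graph), a stand-in for
    the Cannistraci-Hebb score used by CHTs.
    """
    mask = mask.astype(bool)

    # --- prune: remove the weakest prune_frac of active links ---
    rows, cols = np.nonzero(mask)
    n_prune = max(1, int(prune_frac * rows.size))
    order = np.argsort(np.abs(weights[rows, cols]))
    drop = order[:n_prune]
    mask[rows[drop], cols[drop]] = False

    # --- regrow: score every inactive link by a 3-hop path count ---
    # For a candidate bipartite link (i, j), (M @ M.T @ M)[i, j] counts
    # paths i -> j' -> i' -> j through the pruned mask M: links whose
    # endpoints share many neighbors score highest (gradient-free).
    M = mask.astype(float)
    score = M @ M.T @ M
    score[mask] = -np.inf  # never re-add links that are already active
    top = np.argsort(score, axis=None)[::-1][:n_prune]
    grow_r, grow_c = np.unravel_index(top, mask.shape)
    mask[grow_r, grow_c] = True

    # Inactive links hold zero weight; regrown links start at zero
    # and are learned in the next training phase.
    new_weights = np.where(mask, weights, 0.0)
    new_weights[grow_r, grow_c] = 0.0
    return new_weights, mask.astype(int)
```

The key design point the sketch illustrates is that the regrowth decision consults only the connectivity pattern itself, so candidate links can be predicted without any backward pass; in CHTs this topological prediction is what lets ultra-sparse layers be rewired cheaply during training.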