Keynote 1 at IRIS-AI 25, by Egor V. Kostylev.

[Title] Uniform expressivity of Graph Neural Networks

[Abstract] Graph Neural Networks (GNNs) are a recent family of neural architectures that are naturally suited to learning functions on graphs, and they are now used in a wide range of applications. It has been observed that GNNs share many similarities with classical computer science (CS) formalisms, such as the Weisfeiler-Leman graph isomorphism test, bisimulation, and logic. Most notably, both GNNs and these formalisms deal with functions on graphs and graph-like structures. This observation opens up an opportunity to compare GNN architectures with these formalisms in terms of different kinds of expressivity, thereby positioning these architectures within the well-established landscape of theoretical CS. This, in turn, helps us better understand the fundamental capabilities and limitations of various GNN architectures, enabling more informed choices about which architecture to use—if any at all. In this talk, we will look at the foundations of GNNs in terms of the strongest notion of expressivity—uniform expressive power. We begin with an introduction to GNNs as a mathematical object, where we will see that they form, in fact, a rich family of architectures with various properties. We then briefly discuss how we can measure the expressivity of these architectures, also touching upon notions other than uniform expressivity. In the main part of the talk, we will compare the expressivity of various classes of GNNs with a special type of bisimulation and with various logics, such as Graded Modal Logic and its extensions. Finally, we will discuss promising directions and open questions related to the foundations of GNNs.

[Speaker's bio] Prof. Egor V. Kostylev (Department of Informatics, University of Oslo, Norway) obtained his Specialist (2004) and PhD (2009) degrees from the Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University, in the area of Discrete Mathematics and Mathematical Cybernetics. Before starting his current position, he was a researcher at the Laboratory for Foundations of Computer Science, School of Informatics, University of Edinburgh, UK (2010-2013), and a Departmental Lecturer at the Department of Computer Science, University of Oxford, UK (2013-2020). Since 2025, he has also been a research theme leader at the Norwegian Centre for Knowledge-driven Machine Learning, Integreat. His research interests include the theory of AI, in particular knowledge representation and reasoning (including knowledge graphs and ontologies), and its connections to neural networks (including graph NNs), databases, and semantic technologies. He has authored more than 80 scientific papers published in top journals and conferences in the field, two of which received best paper awards (ICDT'16 and IJCAI'17).

Event's website: iris-ai.org
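As background for the abstract above: the Weisfeiler-Leman graph isomorphism test it mentions is the classical yardstick for the distinguishing power of message-passing GNNs (a GNN layer replaces the hashing step below with learned aggregation and update functions). The following is a minimal illustrative sketch of 1-dimensional WL colour refinement — not code from the talk, and all names are my own:

```python
from collections import Counter

def wl_refine(adj, rounds=3):
    """1-WL colour refinement on a graph given as an adjacency-list dict.

    Each round recolours every node by the pair (own colour,
    multiset of neighbours' colours); signatures are then compressed
    back to small integer colours.
    """
    colours = {v: 0 for v in adj}  # start from a uniform colouring
    for _ in range(rounds):
        sigs = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colours = {v: palette[sigs[v]] for v in adj}
    return colours

def wl_histogram(adj, rounds=3):
    """Colour histogram; differing histograms certify non-isomorphism."""
    return Counter(wl_refine(adj, rounds).values())
```

For example, the test separates a triangle from a 3-node path, but famously cannot distinguish a 6-cycle from two disjoint triangles (both are 2-regular, so the colouring never refines) — exactly the kind of limitation that carries over to plain message-passing GNNs.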