The Problem of Size Generalization in Graph Neural Networks
If you have a question about this talk, please contact Pietro Lio.
Recording: https://www.youtube.com/watch?v=W4jNbgtBl_k
In the past few years, graph neural networks (GNNs) have become the de facto model of choice for graph classification and other tasks on graph-structured data. While most GNNs can, from the theoretical viewpoint, operate on graphs of any size, it has been empirically observed that their classification performance degrades when they are applied to graphs whose sizes differ from those in the training data. In this talk we will give an overview of current approaches to the problem of poor size generalization in GNNs, and we will introduce our recent work in this area.
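As background to the claim that GNNs can, in principle, operate on graphs of any size, here is a minimal sketch (not taken from the talk) of a message-passing layer: the learned weight matrix acts only on the feature dimension, so the same parameters apply to graphs with any number of nodes. All names and shapes below are illustrative assumptions.

```python
import numpy as np

def mean_aggregation_layer(adjacency, features, weights):
    """One message-passing layer: each node averages its neighbours'
    features (including its own) and applies a shared linear map.
    The weights depend only on the feature dimension, not on the
    number of nodes, so the layer works for graphs of any size."""
    n = adjacency.shape[0]
    a_hat = adjacency + np.eye(n)                 # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)        # node degrees
    aggregated = (a_hat @ features) / deg         # mean over neighbourhood
    return np.maximum(aggregated @ weights, 0.0)  # ReLU nonlinearity

rng = np.random.default_rng(0)
weights = rng.normal(size=(8, 16))                # shared across graph sizes

# The identical parameters are applied to a 5-node and a 50-node graph.
for n_nodes in (5, 50):
    adjacency = (rng.random((n_nodes, n_nodes)) < 0.2).astype(float)
    adjacency = np.maximum(adjacency, adjacency.T)  # symmetrise
    features = rng.normal(size=(n_nodes, 8))
    out = mean_aggregation_layer(adjacency, features, weights)
    print(n_nodes, out.shape)                     # (5, 16) and (50, 16)
```

The empirical issue the talk addresses is that, although such a layer is well defined for any graph size, the distribution of aggregated messages shifts with size, which can hurt classification accuracy on graphs larger or smaller than those seen during training.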
This talk is part of the Artificial Intelligence Research Group Talks (Computer Laboratory) series.