On Activation and Normalization Layers in Graph Neural Networks
- 🎤 Speaker: Moshe Eliasof, DAMTP, University of Cambridge
- 📅 Date & Time: Monday 19 May 2025, 12:00 - 13:00
- 📍 Venue: SS03 Seminar Room, William Gates Building (Department of Computer Science and Technology)
Abstract
Graph Neural Networks (GNNs) have attracted growing interest in recent years, with most work emphasizing model expressiveness and architectural innovations. By contrast, essential components such as activation and normalization layers remain largely unexplored, often defaulting to ReLU and BatchNorm. In our two NeurIPS 2024 papers, we investigate the effect of diverse activation and normalization functions on GNN performance and introduce two novel, task- and graph-adaptive layers—DiGRAF [1] and GRANOLA [2]. We present the theoretical foundations and design motivations for these layers and validate their practical benefits through extensive experiments on a broad suite of graph-learning benchmarks. Our results demonstrate that DiGRAF and GRANOLA consistently outperform conventional alternatives, highlighting the critical role of adaptive activation and normalization in advancing GNN performance.
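For context, the "conventional defaults" the abstract refers to can be sketched as a single GCN-style message-passing layer followed by batch normalization and ReLU. This is a minimal NumPy illustration of that baseline (not the DiGRAF or GRANOLA layers presented in the talk); the function names and the toy graph are purely illustrative.

```python
import numpy as np

def relu(x):
    # Elementwise ReLU activation, the common default in GNNs
    return np.maximum(x, 0.0)

def batch_norm(x, eps=1e-5):
    # Normalize each feature channel across the node dimension
    # (BatchNorm without learnable scale/shift, for simplicity)
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def gcn_layer(A, X, W):
    # One Kipf-Welling GCN layer: symmetric normalization with
    # self-loops, linear transform, then BatchNorm + ReLU
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return relu(batch_norm(A_norm @ X @ W))

# Toy graph: 4 nodes on a path, 3 input features, 2 output features
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 3))
W = np.random.default_rng(1).normal(size=(3, 2))
H = gcn_layer(A, X, W)  # node embeddings, shape (4, 2), all >= 0
```

Adaptive layers such as those in the talk replace the fixed `relu` and `batch_norm` above with functions conditioned on the task and the input graph.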
Series This talk is part of the Accelerate Lunchtime Seminar Series.
Included in Lists
- bld31
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge talks
- Chris Davis' list
- Guy Emerson's list
- Interested Talks
- ndk22's list
- ob366-ai4er
- rp587
- SS03 Seminar Room, William Gates Building (Department of Computer Science and Technology)
- Trust & Technology Initiative - interesting events
- yk449
Note: Ex-directory lists are not shown.
