BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:On Activation and Normalization Layers in Graph Neural Networks - 
 Moshe Eliasof\, DAMTP\, University of Cambridge
DTSTART:20250519T110000Z
DTEND:20250519T120000Z
UID:TALK231496@talks.cam.ac.uk
CONTACT:Sam Nallaperuma-Herzberg
DESCRIPTION:Graph Neural Networks (GNNs) have attracted growing interest i
 n recent years\, with most work emphasizing model expressiveness and archi
 tectural innovations. By contrast\, essential components such as activatio
 n and normalization layers remain largely unexplored\, often defaulting to
  ReLU and BatchNorm. In our two NeurIPS 2024 papers\, we investigate the e
 ffect of diverse activation and normalization functions on GNN performance
  and introduce two novel\, task- and graph-adaptive layers -- DiGRAF [1] a
 nd GRANOLA [2]. We present the theoretical foundations and design motivati
 ons for these layers and validate their practical benefits through extensi
 ve experiments on a broad suite of graph-learning benchmarks. Our results 
 demonstrate that DiGRAF and GRANOLA consistently outperform conventional a
 lternatives\, highlighting the critical role of adaptive activation and no
 rmalization in advancing GNN performance.
LOCATION:SS03 Seminar Room\, William Gates Building (Department of Compute
 r Science and Technology)
END:VEVENT
END:VCALENDAR
