Polynomial Sheaf Hypergraph Neural Network
- 👤 Speaker: Leo-Minh Kustermann (University of Cambridge)
- 📅 Date & Time: Tuesday 17 February 2026, 18:00 - 19:00
- 📍 Venue: Computer Laboratory, William Gates Building, FW11 and Remote
Abstract
Recording: https://youtu.be/k2qGIOLUkfs
Our architecture builds on a corrected formulation of the sheaf hypergraph Laplacian. We systematically investigate the effects of polynomial degree, of diffusion-based versus direct filtering formulations, and of structural constraints such as stalk dimensionality.
Contrary to our initial hypothesis, experiments on seven benchmark datasets show that polynomial multi-hop filtering does not consistently outperform one-hop sheaf diffusion baselines, particularly on homophilic citation networks. While polynomial filters are theoretically expressive and capable of approximating arbitrary spectral responses, jointly learning all coefficients induces a highly non-convex optimisation landscape, leading to instability, local minima, and higher variance.
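For intuition, here is a minimal sketch of the two filtering regimes the abstract contrasts. This is not the speaker's implementation: the function names, the dense Laplacian `L`, and the step size `alpha` are illustrative assumptions, and the construction of the corrected sheaf hypergraph Laplacian itself is not shown.

```python
import torch


def poly_sheaf_filter(L, X, theta):
    """Degree-K polynomial filter: y = sum_k theta[k] * L^k @ X.

    L     : (nd, nd) sheaf hypergraph Laplacian (dense here for clarity)
    X     : (nd, f) stalk-stacked node features
    theta : (K+1,) jointly learned polynomial coefficients
    """
    out = theta[0] * X
    Z = X
    for k in range(1, len(theta)):
        Z = L @ Z  # one additional hop of sheaf diffusion
        out = out + theta[k] * Z
    return out


def one_hop_diffusion(L, X, alpha=0.1):
    """One-hop sheaf diffusion baseline: a single step X - alpha * L @ X."""
    return X - alpha * (L @ X)
```

The joint coupling of all entries of `theta` through powers of `L` is what makes the polynomial variant expressive but, per the abstract's findings, harder to optimise than the single-parameter one-hop step.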
These results suggest that maximising expressivity through unconstrained polynomial parameterisation can undermine practical performance. We therefore advocate structured learning strategies, such as frequency-band decomposition, spatial pooling, or hybrid spectral–spatial designs, that retain expressive power while introducing inductive biases that stabilise optimisation and improve generalisation on hypergraph-structured data.
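One concrete reading of "frequency-band decomposition" (an assumption on my part, not necessarily the speaker's design) is to partition the spectrum of the Laplacian into a few bands and learn a single gain per band, shrinking the coupled-coefficient search space:

```python
import torch


def band_filter(L, X, gains, bands):
    """Hypothetical band-structured spectral filter.

    Eigendecompose L, scale each frequency band by one learned gain,
    then transform back: U diag(response) U^T X.

    gains : (B,) one learnable gain per band
    bands : list of B (low, high) eigenvalue intervals covering the spectrum
    """
    lam, U = torch.linalg.eigh(L)  # L assumed symmetric
    response = torch.zeros_like(lam)
    for g, (lo, hi) in zip(gains, bands):
        mask = (lam >= lo) & (lam < hi)
        response = response + g * mask.to(lam.dtype)
    return U @ (response.unsqueeze(1) * (U.T @ X))
```

An exact eigendecomposition is shown only for clarity; in practice one would approximate each band response with a fixed low-degree polynomial basis to avoid the O(n^3) cost.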
Series
This talk is part of the Polynomial Sheaf Hypergraph Neural Networks series.