The Width Wall: A Strict Expressivity Hierarchy for Hypergraph Neural Networks
Hypergraph neural networks obey a strict expressivity hierarchy indexed by hypertree width, creating a Width Wall that no fixed-depth model, hidden dimension, or training procedure can cross for wider patterns.