Recognition: unknown
Edge Contraction Pooling for Graph Neural Networks
Abstract
Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse hard pooling transform. We show that EdgePool outperforms alternative pooling methods, can be easily integrated into most GNN models, and improves performance on both node and graph classification.
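The edge-contraction idea is compact enough to sketch in code. The snippet below is a minimal illustration of the general mechanism, not the authors' reference implementation: each edge is scored from its endpoint features, the highest-scoring non-conflicting edges are contracted greedily, and the merged node features are gated by the edge score. The class name `EdgePoolSketch`, the independent sigmoid gate, and the greedy Python loop are simplifications assumed here for clarity.

```python
import torch
import torch.nn as nn


class EdgePoolSketch(nn.Module):
    """Simplified edge-contraction pooling sketch: score edges, greedily
    contract the highest-scoring ones, and gate the merged node features.
    Illustrative only; not the paper's reference implementation."""

    def __init__(self, in_channels):
        super().__init__()
        # Raw edge score is a linear function of both endpoint features.
        self.score_fn = nn.Linear(2 * in_channels, 1)

    def forward(self, x, edge_index):
        # x: [num_nodes, in_channels] node features
        # edge_index: [2, num_edges] COO connectivity
        src, dst = edge_index
        raw = self.score_fn(torch.cat([x[src], x[dst]], dim=-1)).squeeze(-1)
        score = torch.sigmoid(raw)  # per-edge gate (sigmoid used for simplicity)

        cluster = torch.full((x.size(0),), -1, dtype=torch.long, device=x.device)
        pooled = []
        # Greedily contract edges in order of decreasing score, skipping any
        # edge whose endpoints were already merged (a maximal matching).
        for e in torch.argsort(score, descending=True).tolist():
            u, v = src[e].item(), dst[e].item()
            if cluster[u] == -1 and cluster[v] == -1:
                cluster[u] = cluster[v] = len(pooled)
                # Merged feature: gated sum of the two endpoint features.
                pooled.append(score[e] * (x[u] + x[v]))
        # Unmatched nodes survive as singleton clusters.
        for n in range(x.size(0)):
            if cluster[n] == -1:
                cluster[n] = len(pooled)
                pooled.append(x[n])
        new_x = torch.stack(pooled, dim=0)

        # Re-map edges onto the contracted graph; drop self-loops and duplicates.
        remapped = cluster[edge_index]
        remapped = remapped[:, remapped[0] != remapped[1]]
        new_edge_index = torch.unique(remapped, dim=1)
        return new_x, new_edge_index, cluster
```

The actual EdgePool layer normalizes edge scores locally over each node's incident edges rather than with an independent sigmoid, but the overall flow is the same: score edges, contract a maximal set of them, and gate the merged features, roughly halving the number of nodes per pooling step, which is what makes the transform sparse and localized.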
This paper has not been read by Pith yet.
Forward citations
Cited by 2 Pith papers
- Hierarchical Multi-Scale Graph Neural Networks: Scalable Heterophilous Learning with Oversmoothing and Oversquashing Mitigation
  HMH builds soft hierarchies with orthonormal Haar bases and heterophily-aware encoders to apply learnable spectral filters while using skip unpooling to avoid oversmoothing and hub bias on heterophilous graphs.
- The Role of Node Features in Graph Pooling
  Pooling improves graph classification only when node features align well with topology, and the authors provide a quantitative measure of this alignment quality.
discussion (0)