Pith: machine review for the scientific record

arXiv: 1904.05811 · v1 · submitted 2019-04-11 · cs.LG · cs.AI · stat.ML

Recognition: unknown

Relational Graph Attention Networks

Authors on Pith: no claims yet
classification: cs.LG · cs.AI · stat.ML
keywords: graph, relational, attention, networks, evaluation, investigate, models, although
0 comments
original abstract

We investigate Relational Graph Attention Networks, a class of models that extends non-relational graph attention mechanisms to incorporate relational information, opening up these methods to a wider variety of problems. A thorough evaluation of these models is performed, and comparisons are made against established benchmarks. To provide a meaningful comparison, we retrain Relational Graph Convolutional Networks, the spectral counterpart of Relational Graph Attention Networks, and evaluate them under the same conditions. We find that Relational Graph Attention Networks perform worse than anticipated, although some configurations are marginally beneficial for modelling molecular properties. We provide insights as to why this may be, and suggest both modifications to evaluation strategies, as well as directions to investigate for future work.
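The model class described above extends graph attention to multi-relational graphs by learning attention separately per edge type. The following is an illustrative numpy sketch, not the paper's implementation: the per-relation GAT-style additive attention, the single head, and the simple sum aggregation across relations are all assumptions made for brevity.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # standard leaky ReLU used in GAT-style attention logits
    return np.where(x > 0, x, slope * x)

def rgat_layer(h, edges_by_rel, W, a):
    """Minimal relational graph attention layer (illustrative sketch).

    h            : (N, F_in) node features
    edges_by_rel : {rel: [(src, dst), ...]} directed edges per relation type
    W            : {rel: (F_in, F_out)} per-relation projection matrices
    a            : {rel: (2 * F_out,)} per-relation attention vectors
    Returns an (N, F_out) array of updated node features.
    """
    N = h.shape[0]
    F_out = next(iter(W.values())).shape[1]
    out = np.zeros((N, F_out))
    for rel, edges in edges_by_rel.items():
        Wh = h @ W[rel]  # project features under this relation
        # collect attention logits per destination node (additive attention)
        logits = {}
        for src, dst in edges:
            e = leaky_relu(a[rel] @ np.concatenate([Wh[dst], Wh[src]]))
            logits.setdefault(dst, []).append((src, float(e)))
        # softmax over each destination's in-neighbourhood, then aggregate
        for dst, pairs in logits.items():
            es = np.array([e for _, e in pairs])
            alpha = np.exp(es - es.max())
            alpha /= alpha.sum()
            for (src, _), w in zip(pairs, alpha):
                out[dst] += w * Wh[src]
    return out
```

Relation-wise contributions are summed here; the paper's evaluation concerns precisely how such aggregation and attention choices interact, so other combinations (e.g. attention normalised across all relations jointly) are equally plausible variants.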

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 4 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. All Circuits Lead to Rome: Rethinking Functional Anisotropy in Circuit and Sheaf Discovery for LLMs

    cs.CL · 2026-05 · unverdicted · novelty 7.0

    LLM tasks are supported by multiple distinct circuits rather than unique mechanisms, demonstrated via Overlap-Aware Sheaf Repulsion and the Distributive Dense Circuit Hypothesis.

  2. AFGNN: API Misuse Detection using Graph Neural Networks and Clustering

    cs.SE · 2026-04 · unverdicted · novelty 6.0

    AFGNN detects API misuses in Java code more effectively than prior methods by representing usage as graphs and clustering learned embeddings from self-supervised training.

  3. Attention-based graph neural networks: a survey

    cs.SI · 2026-05 · unverdicted · novelty 5.0

    The survey groups attention-based GNNs into three stages—graph recurrent attention networks, graph attention networks, and graph transformers—while reviewing architectures and future directions.

  4. FOCAL-Attention for Heterogeneous Multi-Label Prediction

    cs.LG · 2026-04 · unverdicted · novelty 5.0

    FOCAL fuses unconstrained coverage attention and meta-path anchoring attention to improve multi-label classification on heterogeneous graphs by resolving semantic dilution versus coverage constraint trade-offs.