pith. machine review for the scientific record.

Multilayer feedforward networks with a nonpolynomial activation function can approximate any function

1 Pith paper cites this work. Polarity classification is still indexing.

1 Pith paper citing it

fields

cs.LG 1

years

2021 1

verdicts

CONDITIONAL 1

representative citing papers

How Attentive are Graph Attention Networks?

cs.LG · 2021-05-30 · conditional · novelty 7.0

GAT uses static attention, in which the ranking of neighbors is independent of the query node, so it cannot express some graph problems; GATv2 enables dynamic attention and outperforms GAT on 11 OGB and other benchmarks with an equal parameter count.
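The static-vs-dynamic distinction comes down to where the nonlinearity sits in the attention scoring function. A minimal NumPy sketch (illustrative weights and dimensions, not the cited implementation): in GAT's form, LeakyReLU is applied after the learned vector `a` is dotted with the concatenated features, so the neighbor-dependent term is separable from the query and every query node ranks neighbors identically; GATv2 applies the nonlinearity first, letting rankings depend on the query.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # feature dimension, chosen arbitrarily for illustration
W = rng.normal(size=(d, d))
a = rng.normal(size=2 * d)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def gat_score(h_i, h_j):
    # GAT (static): LeakyReLU(a^T [W h_i || W h_j]).
    # The score is a monotone function of (const_i + s_j), where s_j
    # depends only on the neighbor, so the neighbor ranking is the
    # same for every query node i.
    return leaky_relu(a @ np.concatenate([W @ h_i, W @ h_j]))

W2 = rng.normal(size=(d, 2 * d))
a2 = rng.normal(size=d)

def gatv2_score(h_i, h_j):
    # GATv2 (dynamic): a^T LeakyReLU(W [h_i || h_j]).
    # The nonlinearity mixes query and neighbor before the projection,
    # so rankings can vary with the query node.
    return a2 @ leaky_relu(W2 @ np.concatenate([h_i, h_j]))
```

Because LeakyReLU is strictly increasing, `gat_score`'s top-ranked neighbor is the same for any two query nodes, which is exactly the limitation the citing paper identifies.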

citing papers explorer

Showing 1 of 1 citing paper.

  • How Attentive are Graph Attention Networks? cs.LG · 2021-05-30 · conditional · none · ref 34