pith. machine review for the scientific record.


1 Pith paper cites this work. Polarity classification is still indexing.

1 Pith paper citing it

citation-role summary: baseline 1

citation-polarity summary: (still indexing)

fields: cs.CL 1

years: 2020 1

verdicts: ACCEPT 1

roles: baseline 1

polarities: baseline 1

representative citing papers

Longformer: The Long-Document Transformer

cs.CL · 2020-04-10 · accept · novelty 7.0

Longformer uses local windowed attention plus task-specific global attention to achieve linear scaling and state-of-the-art results on long-document language modeling, QA, and summarization after pretraining.
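The attention pattern described above can be sketched as a boolean mask: each token attends only to neighbors inside a fixed window, while a few designated tokens get full ("global") attention. This is a minimal illustration of the idea, not Longformer's actual implementation; the helper name and parameters are made up for the example.

```python
import numpy as np

def longformer_mask(seq_len, window, global_idx):
    """Illustrative attention mask: local sliding window plus
    global attention for designated token positions."""
    i = np.arange(seq_len)
    # Local window: token i attends to tokens j with |i - j| <= window.
    mask = np.abs(i[:, None] - i[None, :]) <= window
    # Global tokens attend to everything and are attended to by everyone.
    for g in global_idx:
        mask[g, :] = True
        mask[:, g] = True
    return mask

m = longformer_mask(8, window=1, global_idx=[0])
```

Because each row of the mask has O(window) true entries (plus a constant number of global positions), the cost of attention grows linearly with sequence length rather than quadratically.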

citing papers explorer

Showing 1 of 1 citing paper.

  • Longformer: The Long-Document Transformer cs.CL · 2020-04-10 · accept · none · ref 5