Pith. Machine review for the scientific record.

ACL Workshop on Representation Learning for NLP (RepL4NLP)

1 Pith paper cites this work. Polarity classification is still indexing.


fields: cs.CL (1)

years: 2019 (1)

verdicts: ACCEPT (1)

representative citing papers

Language Models as Knowledge Bases?

cs.CL · 2019-09-03 · accept · novelty 7.0

BERT stores relational knowledge that can be extracted via cloze-style queries without fine-tuning, matching supervised baselines on open-domain QA tasks.

citing papers explorer

Showing 1 of 1 citing papers.

  • Language Models as Knowledge Bases? cs.CL · 2019-09-03 · accept · none · ref 76
