CodeBERT: A Pre-Trained Model for Programming and Natural Languages
CodeBERT is a bimodal Transformer pre-trained on natural language–code pairs plus unimodal code data, achieving state-of-the-art results on natural language code search and code documentation generation.