Convolutional Neural Networks for Sentence Classification
Abstract
We report on a series of experiments with convolutional neural networks (CNN) trained on top of pre-trained word vectors for sentence-level classification tasks. We show that a simple CNN with little hyperparameter tuning and static vectors achieves excellent results on multiple benchmarks. Learning task-specific vectors through fine-tuning offers further gains in performance. We additionally propose a simple modification to the architecture to allow for the use of both task-specific and static vectors. The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, which include sentiment analysis and question classification.
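The architecture the abstract describes is a single convolution layer over pre-trained word vectors followed by max-over-time pooling. The sketch below illustrates that forward pass with NumPy using toy dimensions; the vocabulary size, embedding width, window size, and filter count are hypothetical stand-ins (the paper itself uses 300-d word2vec vectors and 100 filters each of widths 3, 4, and 5).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hyperparameters (NOT the paper's actual settings):
vocab_size, embed_dim = 50, 8    # paper: 300-d pre-trained word2vec vectors
window, num_filters = 3, 4       # paper: windows of 3/4/5 with 100 filters each

embeddings = rng.normal(size=(vocab_size, embed_dim))  # stand-in for "static" vectors
filters = rng.normal(size=(num_filters, window * embed_dim))
bias = np.zeros(num_filters)

def sentence_features(token_ids):
    """Convolve each window of word vectors, apply ReLU, then max-over-time pool."""
    x = embeddings[token_ids]                        # (seq_len, embed_dim)
    n = len(token_ids) - window + 1
    convs = np.stack([
        filters @ x[i:i + window].ravel() + bias     # one value per filter per window
        for i in range(n)
    ])                                               # (n, num_filters)
    return np.maximum(convs, 0).max(axis=0)          # (num_filters,) sentence vector

feats = sentence_features([1, 5, 9, 2, 7])
print(feats.shape)  # (4,)
```

The pooled feature vector would then feed a softmax classifier; in the "non-static" variant the embedding table is also updated by backpropagation, and the proposed multichannel modification keeps one static and one fine-tuned copy of the vectors.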
Forward citations
Cited by 5 Pith papers
- iTAG: Inverse Design for Natural Text Generation with Accurate Causal Graph Annotations
  iTAG generates natural text paired with accurate causal graph annotations by framing concept assignment as an inverse problem and refining selections via chain-of-thought reasoning until the text's relations align wit...
- CodeSearchNet Challenge: Evaluating the State of Semantic Code Search
  Releases a large multi-language code corpus and expert-annotated challenge to benchmark semantic code search.
- DRIFT: Drift-Resilient Invariant-Feature Transformer for DGA Detection
  DRIFT uses hybrid character and subword tokenization plus multi-task self-supervised pre-training to build DGA detectors that resist temporal drift and outperform baselines in forward-chaining evaluations over nine ye...
- CodeXGLUE: A Machine Learning Benchmark Dataset for Code Understanding and Generation
  CodeXGLUE supplies a standardized collection of 10 code-related tasks, 14 datasets, an evaluation platform, and BERT-, GPT-, and encoder-decoder-style baselines.
- CodeBERT: A Pre-Trained Model for Programming and Natural Languages
  CodeBERT pre-trains a bimodal model on code and text pairs plus unimodal data to achieve state-of-the-art results on natural language code search and code documentation generation.
Discussion (0)