pith. machine review for the scientific record.

arxiv: 1708.02312 · v2 · submitted 2017-08-07 · 💻 cs.CL · cs.AI · cs.LG

Recognition: unknown

Shortcut-Stacked Sentence Encoders for Multi-Domain Inference

Authors on Pith: no claims yet
classification 💻 cs.CL · cs.AI · cs.LG
keywords encoder · encoders · inference · multi-domain · sentence · achieve · language · natural
read the original abstract

We present a simple sequential sentence encoder for multi-domain natural language inference. Our encoder is based on stacked bidirectional LSTM-RNNs with shortcut connections and fine-tuning of word embeddings. The overall supervised model uses the above encoder to encode two input sentences into two vectors, and then uses a classifier over the vector combination to label the relationship between these two sentences as that of entailment, contradiction, or neutral. Our Shortcut-Stacked sentence encoders achieve strong improvements over existing encoders on matched and mismatched multi-domain natural language inference (top non-ensemble single-model result in the EMNLP RepEval 2017 Shared Task (Nangia et al., 2017)). Moreover, they achieve the new state-of-the-art encoding result on the original SNLI dataset (Bowman et al., 2015).
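The abstract describes the wiring concretely enough to sketch: each stacked layer receives the word embeddings concatenated with the outputs of all previous layers (the "shortcut" connections), the final layer is pooled into a sentence vector, and a classifier operates on a combination of the two sentence vectors. The sketch below is a minimal numpy illustration of that wiring only — it uses a plain tanh RNN as a stand-in for the paper's biLSTM, random untrained weights, and illustrative layer sizes, none of which are the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_layer(x, hidden):
    # Simple tanh RNN cell as a lightweight stand-in for an LSTM
    # (assumption: the paper uses biLSTMs; the shortcut wiring is the same).
    Wx = rng.normal(0, 0.1, (x.shape[1], hidden))
    Wh = rng.normal(0, 0.1, (hidden, hidden))
    h, out = np.zeros(hidden), []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ Wx + h @ Wh)
        out.append(h)
    return np.stack(out)  # (T, hidden)

def bi_rnn(x, hidden):
    # Bidirectional: run forward and backward passes, concatenate states.
    fwd = rnn_layer(x, hidden)
    bwd = rnn_layer(x[::-1], hidden)[::-1]
    return np.concatenate([fwd, bwd], axis=1)  # (T, 2 * hidden)

def shortcut_stacked_encode(emb, hidden_sizes):
    # Shortcut connections: layer i's input is the word embeddings plus
    # the outputs of *all* previous layers, not just the immediately
    # preceding one. Weights are drawn fresh per call for brevity; a real
    # model shares one trained encoder across both sentences.
    layer_in, outputs = emb, []
    for h in hidden_sizes:
        outputs.append(bi_rnn(layer_in, h))
        layer_in = np.concatenate([emb] + outputs, axis=1)
    # Row-wise max pooling over time yields the fixed-size sentence vector.
    return outputs[-1].max(axis=0)

def combine(v1, v2):
    # Standard match features over the two sentence vectors.
    return np.concatenate([v1, v2, np.abs(v1 - v2), v1 * v2])

# Toy usage: two "sentences" as sequences of random 50-d word embeddings.
s1 = rng.normal(size=(7, 50))
s2 = rng.normal(size=(5, 50))
hidden = [64, 128, 256]  # per-direction sizes, purely illustrative
v1 = shortcut_stacked_encode(s1, hidden)
v2 = shortcut_stacked_encode(s2, hidden)
feats = combine(v1, v2)  # would feed an MLP over the three NLI labels
```

The combined feature vector (`[v1; v2; |v1 - v2|; v1 * v2]`) is a common choice for sentence-pair classifiers; consult the paper for the exact combination and layer sizes used.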

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.