pith. machine review for the scientific record.

arxiv: 1704.04347 · v3 · submitted 2017-04-14 · 💻 cs.CL


Exploiting Cross-Sentence Context for Neural Machine Translation

keywords: translation, approach, context, cross-sentence, decoder, historical, machine, neural
Abstract

In translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT). First, this history is summarized in a hierarchical way. We then integrate the historical representation into NMT in two strategies: 1) a warm-start of encoder and decoder states, and 2) an auxiliary context source for updating decoder states. Experimental results on a large Chinese-English translation task show that our approach significantly improves upon a strong attention-based NMT system by up to +2.1 BLEU points.
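The abstract describes a hierarchical summary of the preceding sentences that is then fed to the translation model in two ways: as a warm-start for the encoder/decoder states, and as an auxiliary input at every decoder update. A minimal numpy sketch of that data flow, with plain tanh-RNN cells standing in for the paper's GRUs and mean pooling standing in for its word-level RNN (all parameter names and sizes here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden size (illustrative)

def rnn_step(h, x, Wh, Wx):
    """One tanh-RNN step; stands in for the GRUs used in the paper."""
    return np.tanh(h @ Wh + x @ Wx)

# Hypothetical parameters.
Wh = rng.normal(scale=0.1, size=(d, d))
Wx = rng.normal(scale=0.1, size=(d, d))
W_init = rng.normal(scale=0.1, size=(d, d))

# Word level: each previous sentence is first compressed into one vector
# (here: mean of its word embeddings, standing in for a word-level RNN).
prev_sentences = [rng.normal(size=(n, d)) for n in (5, 7, 4)]
sent_vecs = [s.mean(axis=0) for s in prev_sentences]

# Sentence level: a second RNN runs over the sentence vectors, yielding
# a single history representation D -- the "hierarchical" summary.
D = np.zeros(d)
for s in sent_vecs:
    D = rnn_step(D, s, Wh, Wx)

# Strategy 1: warm-start -- initialize the decoder state from D.
h_dec = np.tanh(D @ W_init)

# Strategy 2: auxiliary context source -- concatenate D with the input
# at every decoder update, so history influences each step directly.
Wx2 = rng.normal(scale=0.1, size=(2 * d, d))
for x in rng.normal(size=(3, d)):  # three decoding steps
    h_dec = rnn_step(h_dec, np.concatenate([x, D]), Wh, Wx2)
```

In a real NMT system the decoder update would also consume the attention-weighted source context; this sketch isolates only how the cross-sentence history vector enters the computation.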
