pith. machine review for the scientific record.

arxiv: 1612.06140 · v2 · submitted 2016-12-19 · 💻 cs.CL

Recognition: unknown

Domain Control for Neural Machine Translation

Authors on Pith: no claims yet
classification 💻 cs.CL
keywords domain, domains, translation, machine, neural, consider, control, improvements
0 comments
read the original abstract

Machine translation systems are very sensitive to the domains they were trained on. Several domain adaptation techniques have been studied in depth. We propose a new technique for neural machine translation (NMT), called domain control, which is applied at runtime using a single neural network covering multiple domains. The presented approach shows quality improvements over dedicated per-domain models when translating text from any of the covered domains, and even on out-of-domain data. In addition, model parameters do not need to be re-estimated for each domain, making the approach practical for real use cases. Evaluation is carried out on English-to-French translation for two testing scenarios. We first consider the case where an end-user translates within a known domain. Second, we consider the scenario where the domain is not known and is predicted at the sentence level before translating. Results show consistent accuracy improvements in both conditions.
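One common way to realize this kind of runtime domain control (a sketch under assumptions — the abstract does not specify the mechanism, and the paper also explores word-level domain features) is to prepend a domain pseudo-token to each source sentence, so that a single model trained on tagged data from all domains can be steered at translation time:

```python
def add_domain_tag(source_sentence: str, domain: str) -> str:
    """Prepend a pseudo-token such as '<med>' or '<it>' marking the domain.

    A single NMT model trained on tagged sentences learns to condition its
    output on the tag, so the domain can be chosen at runtime without
    re-estimating model parameters. Function name and tag format are
    illustrative, not taken from the paper.
    """
    return f"<{domain}> {source_sentence}"

# Training data drawn from several domains, each source side tagged:
tagged_corpus = [
    ("<med> the patient was given aspirin", "le patient a reçu de l'aspirine"),
    ("<it> restart the server", "redémarrez le serveur"),
]

# At inference, the end-user supplies the domain (first scenario), or a
# sentence-level classifier predicts it (second scenario), before tagging:
tagged_input = add_domain_tag("restart the server", "it")
```

With this scheme, switching domains is just a matter of switching the tag on the input, which is what makes a single shared model usable across all covered domains and on out-of-domain text.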

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. CTRL: A Conditional Transformer Language Model for Controllable Generation

    cs.CL · 2019-09 · unverdicted · novelty 6.0

    CTRL is a large conditional transformer language model that uses naturally occurring control codes to steer text generation style and content.
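CTRL's control codes are conceptually the same tagging idea applied to language modeling: a code prefixed to the prompt conditions the style and content of the generated text. A minimal sketch of assembling such a prompt (the helper function is hypothetical; the code names "Wikipedia" and "Reviews" are examples from the CTRL paper):

```python
def build_ctrl_prompt(control_code: str, prompt: str = "") -> str:
    """Prefix a CTRL-style control code to a text prompt.

    CTRL conditions generation on a leading control code (e.g. 'Wikipedia',
    'Reviews', 'Books'); the remaining prompt text follows it. This helper
    only builds the conditioned prompt string, not the model call.
    """
    return f"{control_code} {prompt}".strip()

review_prompt = build_ctrl_prompt("Reviews", "Rating: 4.0")
wiki_prompt = build_ctrl_prompt("Wikipedia")
```

As with the domain tags above, switching the leading code steers a single shared model toward a different style or source domain.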