pith. machine review for the scientific record.

arxiv: 1810.06351 · v1 · submitted 2018-10-15 · 💻 cs.CL

Recognition: unknown

(Self-Attentive) Autoencoder-based Universal Language Representation for Machine Translation

Authors on Pith: no claims yet
classification 💻 cs.CL
keywords: language representation, universal, results, translation, architecture, interlingual, loss
Original abstract

Universal language representation is the holy grail in machine translation (MT). Thanks to the new neural MT approach, there are now good prospects of reaching this goal. In this paper, we propose a new architecture that combines variational autoencoders with encoder-decoders and introduces an interlingual loss as an additional training objective. By adding and enforcing this interlingual loss, we are able to train multiple encoders and decoders for each language, sharing a common universal representation. Since the final objective of this universal representation is to produce close results for similar input sentences (in any language), we propose to evaluate it by encoding the same sentence in two different languages, decoding both latent representations into the same language, and comparing the two outputs. Preliminary results on the WMT 2017 Turkish/English task show that the proposed architecture is capable of learning a universal language representation and simultaneously training both translation directions with state-of-the-art results.
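To make the training objective concrete, here is a minimal NumPy sketch of how an interlingual loss could be combined with a translation objective. The linear encoders, the squared-distance loss form, and the weight `lam` are all assumptions for illustration; the paper's actual encoders are self-attentive and the translation term would be a cross-entropy loss.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_latent = 8, 4

# Hypothetical toy encoders for two languages: plain linear projections
# standing in for the paper's self-attentive encoders (assumption).
W_en = rng.standard_normal((d_latent, d_model))
W_tr = rng.standard_normal((d_latent, d_model))

def encode(W, x):
    # Map a sentence representation into the shared ("universal") latent space.
    return W @ x

def interlingual_loss(z_a, z_b):
    # Distance between the latents of the same sentence in two languages;
    # the paper adds a term like this to the usual translation objective.
    return float(np.mean((z_a - z_b) ** 2))

# Toy fixed-size "sentence" vectors for an aligned English/Turkish pair (made up).
x_en = rng.standard_normal(d_model)
x_tr = rng.standard_normal(d_model)

z_en, z_tr = encode(W_en, x_en), encode(W_tr, x_tr)
translation_loss = 0.0  # placeholder for the cross-entropy translation term
lam = 1.0               # weight of the interlingual term (assumed)
total_loss = translation_loss + lam * interlingual_loss(z_en, z_tr)
print(total_loss)
```

Minimizing the interlingual term pushes the two encoders to map parallel sentences to nearby latents, which is exactly what the proposed evaluation probes: encode the same sentence in both languages, decode both latents into one target language, and compare the outputs.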

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction

    stat.ML · 2018-02 · unverdicted · novelty 7.0

    UMAP is a novel, scalable manifold learning algorithm for dimension reduction that competes with t-SNE while preserving more global structure and having no embedding dimension restrictions.