pith · machine review for the scientific record

arxiv: 1906.01282 · v1 · submitted 2019-06-04 · 💻 cs.CL

Recognition: unknown

Lattice-Based Transformer Encoder for Neural Machine Translation

Authors on Pith: no claims yet
classification 💻 cs.CL
keywords different · lattice-based · segmentations · transformer · translation · encoder · encoders · machine
original abstract

Neural machine translation (NMT) typically takes a single deterministic sequence as its source representation. However, at both the word level and the subword level there are multiple ways to split a source sequence, depending on the choice of word segmentor or subword vocabulary size. We hypothesize that this diversity of segmentations may affect NMT performance. To integrate different segmentations into the state-of-the-art NMT model, the Transformer, we propose lattice-based encoders that learn effective word or subword representations automatically during training. We propose two methods: 1) lattice positional encoding and 2) lattice-aware self-attention. The two methods can be used together and are complementary, further improving translation performance. Experimental results show that lattice-based encoders outperform the conventional Transformer encoder on both word-level and subword-level representations.
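The abstract only names the two mechanisms, so the following is a minimal sketch of the general idea rather than the paper's implementation. It assumes each lattice token carries a character span (start, end) in the source sentence, derives sinusoidal positional encodings from span starts instead of flat token indices (a stand-in for lattice positional encoding), and builds a toy lattice-aware mask that blocks attention between tokens whose spans overlap, since such tokens cannot co-occur on any single lattice path. The span-based masking rule and all names here are illustrative assumptions; the paper's own positional scheme and attention variant differ in detail.

```python
import math
import torch

def sinusoidal_encoding(positions, d_model):
    """Standard sinusoidal encoding evaluated at arbitrary integer positions.

    For a lattice we pass each token's start offset in the source characters,
    not its index in a flat token list, so tokens from different segmentations
    that begin at the same character share a position.
    """
    pe = torch.zeros(len(positions), d_model)
    div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
    pos = positions.float().unsqueeze(1)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe

def lattice_attention_mask(spans):
    """Boolean mask over lattice tokens: True = attention allowed.

    Two tokens from different segmentations can only co-occur on a lattice
    path if their character spans do not overlap, so attention is restricted
    to non-overlapping tokens (plus the token itself).
    """
    n = len(spans)
    mask = torch.zeros(n, n, dtype=torch.bool)
    for i, (si, ei) in enumerate(spans):
        for j, (sj, ej) in enumerate(spans):
            mask[i, j] = (i == j) or (ei <= sj) or (ej <= si)
    return mask

# Toy lattice over the characters "abcd" from two segmentations,
# ["ab", "cd"] and ["a", "bcd"]: four lattice tokens with character spans.
spans = [(0, 2), (2, 4), (0, 1), (1, 4)]
starts = torch.tensor([s for s, _ in spans])

d_model = 8
x = torch.randn(len(spans), d_model)              # token embeddings
x = x + sinusoidal_encoding(starts, d_model)      # lattice positional encoding
mask = lattice_attention_mask(spans)              # lattice-aware mask

scores = (x @ x.T) / math.sqrt(d_model)
scores = scores.masked_fill(~mask, float("-inf")) # block impossible pairs
attn = torch.softmax(scores, dim=-1)              # each token attends only
                                                  # along valid lattice paths
```

In a full encoder, this mask (or relation features derived from the same span relations) would be applied inside every self-attention layer, which is roughly what "lattice-aware self-attention" suggests.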

This paper has not been read by Pith yet.

discussion (0)
