pith. machine review for the scientific record.

arXiv: 1709.01915 · v1 · submitted 2017-09-06 · cs.CL · cs.AI


Towards Neural Machine Translation with Latent Tree Attention

keywords: annotation, attentional, language, machine, model, neural, parse, segmentation
abstract

Building models that take advantage of the hierarchical structure of language without a priori annotation is a longstanding goal in natural language processing. We introduce such a model for the task of machine translation, pairing a recurrent neural network grammar encoder with a novel attentional RNNG decoder and applying policy gradient reinforcement learning to induce unsupervised tree structures on both the source and target. When trained on character-level datasets with no explicit segmentation or parse annotation, the model learns a plausible segmentation and shallow parse, obtaining performance close to an attentional baseline.
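The abstract's key training idea is policy gradient reinforcement learning over latent tree structures. A minimal sketch of that mechanism, not the paper's implementation: a categorical policy chooses between the two binary parses of a three-token span, and a REINFORCE update with a moving-average baseline pushes it toward the parse that earns the higher reward. The reward values here are hypothetical stand-ins for the downstream translation score the real model would receive.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Two binary trees over a 3-token span: ((a b) c) and (a (b c)).
theta = np.zeros(2)                   # policy logits over the two tree shapes
true_reward = np.array([1.0, 0.2])    # toy rewards (assumption, not from the paper)

lr, baseline = 0.5, 0.0
for step in range(500):
    p = softmax(theta)
    a = rng.choice(2, p=p)            # sample a latent tree from the policy
    r = true_reward[a]                # observe its reward (stand-in for BLEU)
    baseline = 0.9 * baseline + 0.1 * r   # moving-average baseline cuts variance
    # For a softmax policy, d log p(a) / d theta = onehot(a) - p.
    grad_logp = -p
    grad_logp[a] += 1.0
    # REINFORCE: follow (r - baseline) * grad of log-probability.
    theta += lr * (r - baseline) * grad_logp
```

After training, the policy concentrates on the better-scoring parse. The paper's model does this with structured shift/reduce decisions inside an RNNG rather than a flat categorical, but the gradient estimator is the same family.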

This paper has not been read by Pith yet.

discussion (0)
