pith. machine review for the scientific record.

arxiv: 1611.01462 · v3 · submitted 2016-11-04 · 💻 cs.LG · cs.CL · stat.ML

Recognition: unknown

Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling

Authors on Pith: no claims yet
classification: 💻 cs.LG · cs.CL · stat.ML
keywords: framework · language · modeling · word · input · leads · learning · models
Original abstract

Recurrent neural networks have been very successful at predicting sequences of words in tasks such as language modeling. However, all such models are based on the conventional classification framework, where the model is trained against one-hot targets, and each word is represented both as an input and as an output in isolation. This causes inefficiencies in learning both in terms of utilizing all of the information and in terms of the number of parameters needed to train. We introduce a novel theoretical framework that facilitates better learning in language modeling, and show that our framework leads to tying together the input embedding and the output projection matrices, greatly reducing the number of trainable variables. Our framework leads to state of the art performance on the Penn Treebank with a variety of network models.
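The tying the abstract describes can be illustrated with a toy sketch (not the authors' code; all sizes and names below are hypothetical): the same matrix whose rows serve as input embeddings is reused to score output words, so no separate output projection is trained.

```python
import random

# Toy illustration of tied input/output word matrices.
# V = vocabulary size, d = embedding dimension (illustrative values).
V, d = 8, 4

# One shared matrix E: row i is word i's input embedding AND its
# output weight vector. An untied model would train a second d x V
# projection on top of this, roughly doubling these parameters.
E = [[random.gauss(0.0, 0.1) for _ in range(d)] for _ in range(V)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def logits(hidden):
    # Output scores are dot products against the same rows used for
    # input lookup, so the output "projection" costs no extra parameters.
    return [dot(E[w], hidden) for w in range(V)]

h = E[3]                # stand-in for a hidden state (here: word 3's embedding)
scores = logits(h)      # one score per vocabulary word

untied_params = V * d + d * V   # separate embedding + projection
tied_params = V * d             # shared matrix only
print(len(scores), untied_params, tied_params)  # 8 64 32
```

In a real recurrent model the hidden state would come from the RNN, not directly from the embedding table, but the parameter accounting is the same: tying removes the d × V output matrix entirely.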

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 2 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. The Falcon Series of Open Language Models

    cs.CL 2023-11 conditional novelty 6.0

    Falcon-180B is a 180B-parameter open decoder-only model trained on 3.5 trillion tokens that approaches PaLM-2-Large performance at lower cost and is released with dataset extracts.

  2. CTRL: A Conditional Transformer Language Model for Controllable Generation

    cs.CL 2019-09 unverdicted novelty 6.0

    CTRL is a large conditional transformer language model that uses naturally occurring control codes to steer text generation style and content.