pith. machine review for the scientific record.

arxiv: 1605.01652 · v1 · submitted 2016-05-05 · 💻 cs.AI · cs.CL

Recognition: unknown

LSTM-based Mixture-of-Experts for Knowledge-Aware Dialogues

Authors on Pith: no claims yet
classification: 💻 cs.AI · cs.CL
keywords: good · model · experts · lstm-based · neural · several · application · approach
read the original abstract

We introduce an LSTM-based method for dynamically integrating several word-prediction experts to obtain a conditional language model that can simultaneously be good at several subtasks. We illustrate this general approach with an application to dialogue, where we integrate a neural chat model, good at conversational aspects, with a neural question-answering model, good at retrieving precise information from a knowledge base, and show how the integration combines the strengths of the independent components. We hope that this focused contribution will attract attention to the benefits of using such mixtures of experts in NLP.
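The core idea in the abstract, combining per-expert next-word distributions with gate weights produced dynamically (in the paper, by an LSTM), can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names, the toy vocabulary, and the assumption that gate logits are precomputed (rather than emitted by an LSTM hidden state at each step) are all ours.

```python
import numpy as np

def combine_experts(expert_probs, gate_logits):
    """Mix per-expert next-word distributions into one distribution.

    expert_probs: array of shape (n_experts, vocab_size), each row a
                  probability distribution over the vocabulary.
    gate_logits:  array of shape (n_experts,); in the paper's setting
                  these would come from an LSTM state at each timestep.
    """
    # softmax over the gate logits -> mixture weights summing to 1
    gates = np.exp(gate_logits - gate_logits.max())
    gates = gates / gates.sum()
    # weighted sum of the expert distributions
    return np.einsum("e,ev->v", gates, expert_probs)

# toy example: a "chat" expert and a "QA" expert over a 4-word vocabulary
chat = np.array([0.70, 0.10, 0.10, 0.10])  # favors conversational words
qa   = np.array([0.05, 0.05, 0.80, 0.10])  # favors a knowledge-base answer
mixed = combine_experts(np.stack([chat, qa]), np.array([0.0, 0.0]))
# with equal gate logits this is the plain average of the two experts
```

Because the gate weights form a softmax, the mixture is always a valid probability distribution; shifting the gate logits toward one expert lets the model lean on the chat expert for small talk and the QA expert when a knowledge-base fact is needed.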

This paper has not been read by Pith yet.

discussion (0)
