pith. machine review for the scientific record.

arxiv: 1512.08301 · v2 · submitted 2015-12-28 · 💻 cs.NE

Recognition: unknown

Feedforward Sequential Memory Networks: A New Structure to Learn Long-term Dependency

Authors on Pith · no claims yet
classification 💻 cs.NE
keywords memory, structure, feedforward, FSMNs, networks, neural, sequential, blocks
0 comments
original abstract

In this paper, we propose a novel neural network structure, namely the feedforward sequential memory network (FSMN), to model long-term dependency in time series without using recurrent feedback. The proposed FSMN is a standard fully-connected feedforward neural network equipped with learnable memory blocks in its hidden layers. The memory blocks use a tapped-delay-line structure to encode long context information into a fixed-size representation, acting as a short-term memory mechanism. We have evaluated the proposed FSMNs on several standard benchmark tasks, including speech recognition and language modeling. Experimental results show that FSMNs significantly outperform conventional recurrent neural networks (RNNs), including LSTMs, in modeling sequential signals such as speech and language. Moreover, FSMNs can be trained much more reliably and quickly than RNNs or LSTMs owing to their inherent non-recurrent model structure.
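The tapped-delay-line memory block described in the abstract lends itself to a short sketch. Below is a minimal NumPy illustration of the unidirectional, scalar-coefficient variant as we read it: each step's memory vector is a learned weighted sum of the current and previous N hidden activations. All names here (fsmn_memory_block, hidden, coeffs) are ours for illustration, and the uniform tap initialisation is a placeholder; in the paper the coefficients are learned jointly with the rest of the network.

```python
import numpy as np

def fsmn_memory_block(hidden, coeffs):
    """Tapped-delay-line memory over hidden-layer activations.

    hidden : (T, D) array of hidden activations for T time steps.
    coeffs : (N + 1,) tap weights a_0..a_N applied to the current
             and previous N steps (learnable in the actual model).

    Returns a (T, D) array: memory[t] = sum_i coeffs[i] * hidden[t - i],
    a fixed-size encoding of the preceding context at every step.
    """
    T, D = hidden.shape
    order = len(coeffs) - 1
    memory = np.zeros_like(hidden)
    for t in range(T):
        for i in range(order + 1):
            if t - i >= 0:  # taps that reach before t=0 contribute nothing
                memory[t] += coeffs[i] * hidden[t - i]
    return memory

# toy usage: 5 frames of 3-dim activations, lookback of 2 steps
h = np.random.randn(5, 3)
a = np.ones(3) / 3.0  # uniform taps, purely for illustration
print(fsmn_memory_block(h, a).shape)  # (5, 3)
```

Because the memory is a finite weighted sum rather than a recurrent state, each output depends on a bounded window of past frames, which is the property the abstract credits for more reliable and faster training than RNNs or LSTMs.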

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. UAF: A Unified Audio Front-end LLM for Full-Duplex Speech Interaction

cs.AI · 2026-04 · unverdicted · novelty 6.0

    UAF is the first unified audio front-end LLM that turns multiple front-end tasks into one sequence prediction model processing streaming audio chunks and reference prompts to output semantic and control tokens for ful...