pith. machine review for the scientific record.

arxiv: 1609.01704 · v7 · submitted 2016-09-06 · 💻 cs.LG

Recognition: unknown

Hierarchical Multiscale Recurrent Neural Networks

Authors on Pith: no claims yet
classification: 💻 cs.LG
keywords: hierarchical, multiscale, recurrent, neural, networks, sequence, structure
original abstract

Learning both hierarchical and temporal representations has been among the long-standing challenges of recurrent neural networks. Multiscale recurrent neural networks have been considered a promising approach to this issue, yet there has been a lack of empirical evidence that this type of model can actually capture temporal dependencies by discovering the latent hierarchical structure of the sequence. In this paper, we propose a novel multiscale approach, called the hierarchical multiscale recurrent neural network, which can capture the latent hierarchical structure in the sequence by encoding temporal dependencies at different timescales using a novel update mechanism. We show some evidence that our proposed multiscale architecture can discover the underlying hierarchical structure in sequences without using explicit boundary information. We evaluate our proposed model on character-level language modelling and handwriting sequence modelling.
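The "novel update mechanism" in the abstract refers to three per-layer operations (COPY, UPDATE, FLUSH) selected by learned binary boundary detectors. A minimal sketch of the selection logic only, assuming a simplified state update: the function name and the 0.5 convex-combination stand-in for the paper's gated recurrent update are illustrative, not the authors' equations.

```python
import numpy as np

def hm_rnn_step(c_prev, c_cand, z_below, z_prev):
    """One layer of a hierarchical multiscale step (simplified sketch).

    z_below: 1 if the layer below detected a segment boundary this step
    z_prev:  1 if this layer emitted a boundary at the previous step
    """
    if z_prev == 1:
        # FLUSH: this layer just closed a segment and passed its summary
        # upward, so restart the state from the candidate.
        c = c_cand
    elif z_below == 1:
        # UPDATE: the layer below finished a segment; fold its summary in
        # (a fixed convex combination stands in for the gated LSTM update).
        c = 0.5 * c_prev + 0.5 * c_cand
    else:
        # COPY: no boundary anywhere, keep the state unchanged so this
        # layer runs at a slower effective timescale than the one below.
        c = c_prev
    return c, np.tanh(c)
```

Because COPY dominates between boundaries, higher layers are updated only when lower layers complete a segment, which is how different timescales emerge without explicit boundary labels.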

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 3 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Categorical Reparameterization with Gumbel-Softmax

    stat.ML 2016-11 unverdicted novelty 8.0

    Gumbel-Softmax provides a continuous relaxation of categorical sampling that anneals to discrete samples for gradient-based optimization.

  2. Geometry-Induced Long-Range Correlations in Recurrent Neural Network Quantum States

    quant-ph 2026-04 conditional novelty 7.0

    Dilated RNN wave functions induce power-law correlations for the critical 1D transverse-field Ising model and the Cluster state, unlike the exponential decay of conventional RNN ansätze.

  3. Continuity Laws for Sequential Models

    cs.LG 2026-05 unverdicted novelty 6.0

    S4 models exhibit stable time-continuity unlike sensitive S6 models, with task continuity predicting performance and enabling temporal subsampling for better efficiency.
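The Gumbel-Softmax relaxation summarized in the first citation above can be sketched as follows; a minimal illustration assuming standard logits-plus-Gumbel-noise sampling, with the softmax temperature `tau` controlling how close the output is to a discrete one-hot sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=1.0):
    """Continuous relaxation of sampling from softmax(logits).

    As tau -> 0 the output approaches a one-hot categorical sample;
    larger tau gives smoother points on the probability simplex.
    """
    g = rng.gumbel(size=logits.shape)   # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = y - y.max()                     # numerically stable softmax
    e = np.exp(y)
    return e / e.sum()

sample = gumbel_softmax(np.array([1.0, 2.0, 3.0]), tau=0.5)
```

Annealing `tau` toward zero during training is what lets gradient-based optimization pass through an (approximately) discrete sampling step.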