pith. machine review for the scientific record.

arxiv: 1605.07154 · v1 · submitted 2016-05-23 · 💻 cs.LG · cs.NE


Path-Normalized Optimization of Recurrent Neural Networks with ReLU Activations

keywords: rnns, relu, activations, geometry, networks, neural, optimization, path-sgd
abstract

We investigate the parameter-space geometry of recurrent neural networks (RNNs) and develop an adaptation of the path-SGD optimization method, attuned to this geometry, that can learn plain RNNs with ReLU activations. On several datasets that require capturing long-term dependency structure, we show that path-SGD significantly improves the trainability of ReLU RNNs compared to RNNs trained with SGD, even under various recently suggested initialization schemes.
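The abstract's key idea, path-normalized optimization, can be illustrated on a simpler case. The sketch below shows the path-SGD update for a two-layer feedforward ReLU network without biases: each gradient entry is divided by that weight's share of the path regularizer (the sum over input-output paths of the product of squared weights). This is a hypothetical illustration, not the paper's RNN algorithm, which additionally handles weights shared across time steps; all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h, o = 4, 5, 3            # input, hidden, output sizes (illustrative)
W1 = rng.normal(size=(h, d)) # first-layer weights
W2 = rng.normal(size=(o, h)) # second-layer weights

def path_scales(W1, W2):
    """Per-weight scale kappa_e = d(gamma^2)/d(w_e^2), where gamma^2 is the
    path regularizer: the sum over input-output paths of the product of
    squared weights along the path. For a two-layer net this factorizes:
      scale for W1[j, i] = sum_k W2[k, j]^2
      scale for W2[k, j] = sum_i W1[j, i]^2
    """
    s1 = np.broadcast_to((W2 ** 2).sum(axis=0)[:, None], W1.shape)
    s2 = np.broadcast_to((W1 ** 2).sum(axis=1)[None, :], W2.shape)
    return s1, s2

def path_sgd_step(W1, W2, g1, g2, lr=0.01, eps=1e-8):
    """One path-normalized step: rescale each gradient entry by its path
    scale, making the update invariant to node-wise rescalings that leave
    the ReLU network's function unchanged."""
    s1, s2 = path_scales(W1, W2)
    return W1 - lr * g1 / (s1 + eps), W2 - lr * g2 / (s2 + eps)
```

A quick sanity check of the factorization: summing `scale * weight^2` over either layer recovers the full path regularizer gamma^2, since every path is counted exactly once through each layer.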

This paper has not been read by Pith yet.

discussion (0)
