pith. machine review for the scientific record.

arxiv: 1810.06801 · v4 · submitted 2018-10-16 · 💻 cs.LG · stat.ML

Recognition: unknown

Quasi-hyperbolic momentum and Adam for deep learning

Authors on Pith: no claims yet
classification: 💻 cs.LG · stat.ML
keywords: momentum, algorithms, adam, deep learning, propose, qhadam, quasi-hyperbolic
Original abstract

Momentum-based acceleration of stochastic gradient descent (SGD) is widely used in deep learning. We propose the quasi-hyperbolic momentum algorithm (QHM) as an extremely simple alteration of momentum SGD, averaging a plain SGD step with a momentum step. We describe numerous connections to and identities with other algorithms, and we characterize the set of two-state optimization algorithms that QHM can recover. Finally, we propose a QH variant of Adam called QHAdam, and we empirically demonstrate that our algorithms lead to significantly improved training in a variety of settings, including a new state-of-the-art result on WMT16 EN-DE. We hope that these empirical results, combined with the conceptual and practical simplicity of QHM and QHAdam, will spur interest from both practitioners and researchers. Code is immediately available.
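The QHM rule the abstract describes is simple enough to sketch directly. Below is a minimal NumPy sketch of one QHM step, assuming the standard formulation: a momentum buffer tracks an exponential moving average of gradients, and the parameter step is a ν-weighted average of the plain gradient and that buffer (ν = 0 gives plain SGD, ν = 1 gives momentum SGD). Function names, hyperparameter values, and the toy loss are illustrative, not the authors' released code.

```python
import numpy as np

def qhm_step(theta, buf, grad, lr=0.1, beta=0.999, nu=0.7):
    """One quasi-hyperbolic momentum (QHM) step.

    The update direction averages a plain SGD step with a momentum
    step: (1 - nu) * grad + nu * buf, where buf is an exponential
    moving average of past gradients.
    """
    buf = beta * buf + (1.0 - beta) * grad               # momentum buffer (EMA of gradients)
    theta = theta - lr * ((1.0 - nu) * grad + nu * buf)  # nu-weighted average step
    return theta, buf

# Toy usage: minimize L(theta) = 0.5 * ||theta||^2, whose gradient is theta.
theta = np.array([1.0, -2.0])
buf = np.zeros_like(theta)
for _ in range(500):
    theta, buf = qhm_step(theta, buf, grad=theta)
print(theta)  # approaches the origin
```

QHAdam, per the abstract, applies the same idea to Adam: the raw gradient is averaged against Adam's first-moment estimate with one weight (written ν₁ in the paper) and the squared gradient against the second-moment estimate with another (ν₂). The sketch above covers QHM only.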

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Momentum Further Constrains Sharpness at the Edge of Stochastic Stability

    cs.LG · 2026-04 · unverdicted · novelty 7.0

    Momentum SGD exhibits two distinct edge-of-stochastic-stability (EoSS) regimes for batch sharpness, stabilizing at 2(1−β)/η for small batches and at 2(1+β)/η for large batches, aligning with linear stability thresholds (both written out below).
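For reference, the two plateaus quoted in that summary, written out (β is the momentum coefficient and η the learning rate; this restates the summary above, not the cited paper's derivation):

```latex
% Batch-sharpness plateaus for momentum SGD at the edge of
% stochastic stability, as quoted in the summary above.
% Sanity check: at beta = 0 both collapse to 2/eta, the classical
% full-batch edge-of-stability sharpness threshold for gradient descent.
\[
  \lambda_{\text{small-batch}} \approx \frac{2(1-\beta)}{\eta},
  \qquad
  \lambda_{\text{large-batch}} \approx \frac{2(1+\beta)}{\eta}
\]
```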