Pith: machine review for the scientific record

arxiv: 1905.02957 · v1 · submitted 2019-05-08 · 💻 cs.LG · stat.ML


SAdam: A Variant of Adam for Strongly Convex Functions

keywords: strongly convex functions, Adam, SAdam, regret bound, convexity
Abstract

The Adam algorithm has become extremely popular for large-scale machine learning. Under a convexity condition, it has been proved to enjoy a data-dependent $O(\sqrt{T})$ regret bound, where $T$ is the time horizon. However, whether strong convexity can be utilized to further improve the performance remains an open problem. In this paper, we give an affirmative answer by developing a variant of Adam (referred to as SAdam) which achieves a data-dependent $O(\log T)$ regret bound for strongly convex functions. The essential idea is to maintain a faster-decaying yet controlled step size that exploits strong convexity. In addition, under a special configuration of hyperparameters, our SAdam reduces to SC-RMSprop, a recently proposed variant of RMSprop for strongly convex functions, for which we provide the first data-dependent logarithmic regret bound. Empirical results on optimizing strongly convex functions and training deep networks demonstrate the effectiveness of our method.
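The abstract's core mechanism, a faster-decaying yet controlled step size, can be sketched as a scalar Adam-style update whose step decays as $O(1/t)$ instead of $O(1/\sqrt{t})$. Everything below (the function name, the `delta / t` stabilizer, and the hyperparameter values) is an illustrative assumption; the paper's exact SAdam update differs in detail.

```python
def sadam_like_step(x, grad, m, v, t,
                    alpha=0.01, beta1=0.9, beta2=0.99, delta=1e-2):
    """One update in the spirit of SAdam (illustrative sketch only).

    Like Adam, it tracks exponential moving averages of the gradient
    and squared gradient, but the step size decays as O(1/t) rather
    than O(1/sqrt(t)) -- the abstract's key idea for exploiting strong
    convexity and obtaining an O(log T) regret bound.
    """
    m = beta1 * m + (1 - beta1) * grad       # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2  # second moment
    step = alpha / t                         # O(1/t) decaying step size
    x = x - step * m / (v + delta / t)       # adaptive, stabilized update
    return x, m, v
```

For example, iterating this update on the strongly convex function $f(x) = x^2$ (gradient $2x$) from $x = 5$ moves the iterate monotonically toward the minimizer at $0$.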

This paper has not been read by Pith yet.

discussion (0)
