pith. machine review for the scientific record.

arxiv: 1506.06840 · v2 · submitted 2015-06-23 · 💻 cs.LG · stat.ML

Recognition: unknown

On Variance Reduction in Stochastic Gradient Descent and its Asynchronous Variants

authors on Pith: no claims yet
classification: 💻 cs.LG, stat.ML
keywords: asynchronous, algorithms, reduction, variance, svrg, descent, framework
original abstract

We study optimization algorithms based on variance reduction for stochastic gradient descent (SGD). Remarkable recent progress has been made in this direction through the development of algorithms such as SAG, SVRG, and SAGA. These algorithms have been shown to outperform SGD both theoretically and empirically. However, asynchronous versions of these algorithms, a crucial requirement for modern large-scale applications, have not been studied. We bridge this gap by presenting a unifying framework for many variance reduction techniques. Subsequently, we propose an asynchronous algorithm grounded in our framework and prove its fast convergence. An important consequence of our general approach is that it yields asynchronous versions of variance reduction algorithms such as SVRG and SAGA as a byproduct. Our method achieves near-linear speedup in sparse settings common to machine learning. We demonstrate the empirical performance of our method through a concrete realization of asynchronous SVRG.
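For context, the serial SVRG scheme that the abstract builds on (Johnson & Zhang, 2013) alternates full-gradient "snapshot" passes with variance-reduced stochastic steps. The sketch below is a minimal NumPy illustration of that serial baseline, not of this paper's asynchronous algorithm; the helper names (svrg, grad_i) and the toy least-squares problem are illustrative assumptions, not code from the paper.

```python
import numpy as np

def svrg(grad_i, w0, n, step_size=0.01, num_epochs=10, rng=None):
    """Minimal serial SVRG sketch (hypothetical helper, not the paper's code).

    grad_i(w, i) must return the gradient of the i-th component function at w,
    for an objective f(w) = (1/n) * sum_i f_i(w).
    """
    rng = np.random.default_rng() if rng is None else rng
    w = w0.astype(float).copy()
    m = 2 * n  # a common choice for the inner-loop length
    for _ in range(num_epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot: the anchor used for variance reduction.
        full_grad = sum(grad_i(w_snap, i) for i in range(n)) / n
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased for the full
            # gradient, with variance that shrinks as w and w_snap approach
            # the optimum; this is what permits a constant step size.
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
            w -= step_size * g
    return w

# Toy usage: least squares, with f_i(w) = 0.5 * (x_i . w - y_i)^2.
X = np.random.default_rng(0).normal(size=(200, 5))
y = X @ np.arange(5.0)
w_hat = svrg(lambda w, i: (X[i] @ w - y[i]) * X[i], np.zeros(5), n=200)
```

The paper's asynchronous variant and its unifying framework are deliberately not sketched here, since the abstract alone does not specify their details.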

This paper has not been read by Pith yet.

discussion (0)
