pith. machine review for the scientific record.

arxiv: 1209.1873 · v2 · submitted 2012-09-10 · 📊 stat.ML · cs.LG · math.OC

Recognition: unknown

Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization

Authors on Pith: no claims yet
classification: 📊 stat.ML · cs.LG · math.OC
keywords: analysis · ascent · coordinate · dual · stochastic · guarantees · methods · sdca
original abstract

Stochastic Gradient Descent (SGD) has become popular for solving large-scale supervised machine learning optimization problems such as SVM, due to its strong theoretical guarantees. While the closely related Dual Coordinate Ascent (DCA) method has been implemented in various software packages, it has so far lacked a good convergence analysis. This paper presents a new analysis of Stochastic Dual Coordinate Ascent (SDCA) showing that this class of methods enjoys strong theoretical guarantees that are comparable to or better than those of SGD. This analysis justifies the effectiveness of SDCA for practical applications.
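To make the abstract concrete, here is a minimal sketch of the SDCA scheme it describes, specialized to an L2-regularized SVM with hinge loss, one of the paper's motivating examples. The function name `sdca_svm`, the hyperparameter defaults, and the synthetic data are illustrative assumptions, not the authors' reference implementation; the closed-form coordinate update used here is the standard exact dual maximization for the hinge loss.

```python
# A hedged sketch, not the paper's code: SDCA for
#   min_w (1/n) * sum_i max(0, 1 - y_i * w.x_i) + (lam/2) * ||w||^2
# Dual variables beta_i = alpha_i * y_i live in [0, 1], and the primal
# iterate is kept in sync as w = (1/(lam*n)) * sum_i beta_i * y_i * x_i.
import numpy as np

def sdca_svm(X, y, lam=0.01, epochs=10, seed=0):
    """X: (n, d) features; y: (n,) labels in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = np.zeros(n)                      # one dual variable per example
    w = np.zeros(d)                         # primal iterate, synced with beta
    sq_norms = np.einsum("ij,ij->i", X, X)  # precompute ||x_i||^2
    for _ in range(epochs):
        for i in rng.permutation(n):        # random coordinate order
            if sq_norms[i] == 0.0:
                continue
            # Exact maximization of the dual over coordinate i: for hinge
            # loss the one-dimensional subproblem has a closed-form solution,
            # a clipped step proportional to the margin violation.
            margin = 1.0 - y[i] * (w @ X[i])
            beta_new = np.clip(beta[i] + lam * n * margin / sq_norms[i], 0.0, 1.0)
            # Incremental primal update keeps w equal to its dual expression.
            w += (beta_new - beta[i]) * y[i] * X[i] / (lam * n)
            beta[i] = beta_new
    return w

# Toy usage on separable synthetic data (hypothetical, for illustration only).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = np.sign(X @ rng.normal(size=5))
w = sdca_svm(X, y, lam=0.1, epochs=20)
print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

Each coordinate step costs O(d), like an SGD step, but solves its one-dimensional subproblem exactly; the paper's contribution is the convergence analysis showing this per-coordinate scheme matches or beats SGD's guarantees.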

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Variance Matters: Improving Domain Adaptation via Stratified Sampling

    cs.LG · 2025-12 · unverdicted · novelty 6.0

    VaRDASS improves unsupervised domain adaptation by using stratified sampling to reduce variance in discrepancy estimation for measures like correlation alignment and MMD, with derived error bounds, an optimality proof...