pith. machine review for the scientific record.

arxiv: 1005.2146 · v1 · submitted 2010-05-12 · 💻 cs.LG · cs.NA

Recognition: unknown

On the Finite Time Convergence of Cyclic Coordinate Descent Methods

Authors on Pith: no claims yet
classification 💻 cs.LG cs.NA
keywords: descent, convergence, coordinate, cyclic, time, finite, gradient, methods
Original abstract

Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in machine learning. Reasons for this include its simplicity, speed and stability, as well as its competitive performance on $\ell_1$ regularized smooth optimization problems. Surprisingly, very little is known about its finite time convergence behavior on these problems. Most existing results either just prove convergence or provide asymptotic rates. We fill this gap in the literature by proving $O(1/k)$ convergence rates (where $k$ is the iteration counter) for two variants of cyclic coordinate descent under an isotonicity assumption. Our analysis proceeds by comparing the objective values attained by the two variants with each other, as well as with the gradient descent algorithm. We show that the iterates generated by the cyclic coordinate descent methods remain better than those of gradient descent uniformly over time.
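To make the method concrete, here is a minimal sketch of cyclic coordinate descent on an $\ell_1$ regularized smooth problem (lasso: least squares plus an $\ell_1$ penalty), the problem class the abstract mentions. This is an illustrative implementation, not the paper's two analyzed variants; the soft-thresholding coordinate update and the incremental residual bookkeeping are standard choices assumed here.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.|: shrink z toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cyclic_cd_lasso(A, b, lam, n_iters=100):
    """Cyclic coordinate descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    One iteration k is a full sweep over the coordinates in a fixed
    cyclic order, minimizing the objective exactly in each coordinate
    while the others are held fixed.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = np.sum(A * A, axis=0)   # per-coordinate curvature ||a_j||^2
    r = b - A @ x                    # residual, kept up to date
    for _ in range(n_iters):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # exact one-dimensional minimization over coordinate j:
            # rho = a_j^T (residual with coordinate j's contribution added back)
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)  # incremental residual update
            x[j] = x_new
    return x
```

With `A` the identity the coordinates decouple and a single sweep already lands on the closed-form soft-thresholded solution, which makes the update easy to sanity-check by hand.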

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Turning Stale Gradients into Stable Gradients: Coherent Coordinate Descent with Implicit Landscape Smoothing for Lightweight Zeroth-Order Optimization

    cs.LG 2026-05 unverdicted novelty 7.0

    CoCD converts stale gradients into stable descent directions for zeroth-order optimization through coherent coordinate updates and implicit landscape smoothing from larger finite-difference steps.