Breaking the heavy-tailed noise barrier in stochastic optimization problems
1 Pith paper cites this work. Polarity classification is still indexing.
1 Pith paper citing it

Fields: math.OC
Years: 2026
Verdicts: UNVERDICTED
Representative citing papers: 1

Citing papers
Robust stochastic first order methods in heavy-tailed noise via medoid mini-batch gradient sampling
R-SGD-Mini achieves O(1/T) convergence of expected squared gradient norm to a noise-dependent neighborhood in heavy-tailed settings by selecting the medoid gradient from M data chunks.
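The summary above describes the core idea: split a mini-batch into M chunks, compute one stochastic gradient per chunk, and step along the medoid, i.e. the chunk gradient minimizing the sum of distances to all the others, which discards heavy-tailed outlier gradients. A minimal sketch of that selection rule, with all function names (`medoid_gradient`, `sgd_mini_step`) and the step size chosen here for illustration rather than taken from the paper:

```python
import numpy as np

def medoid_gradient(grads):
    """Return the medoid of a list of gradient vectors: the one
    minimizing the sum of Euclidean distances to all the others."""
    G = np.stack(grads)                                   # shape (M, d)
    dists = np.linalg.norm(G[:, None, :] - G[None, :, :], axis=-1)
    return G[np.argmin(dists.sum(axis=1))]

def sgd_mini_step(x, grad_fn, chunks, lr=0.1):
    """One illustrative robust-SGD step (a sketch, not the paper's
    exact algorithm): per-chunk gradients, then a medoid update."""
    grads = [grad_fn(x, c) for c in chunks]               # M chunk gradients
    return x - lr * medoid_gradient(grads)
```

Because a single heavy-tailed gradient sits far from the other M−1 chunk gradients, its distance sum is large and it is never selected as the medoid, which is what yields convergence to a noise-dependent neighborhood rather than divergence.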