The authors introduce (ηx,ηy,δ,ε)-GSSP as a convergence criterion and develop projected gradient-free descent-ascent methods achieving non-asymptotic rates for nonsmooth nonconvex-concave minimax optimization without weak convexity assumptions.
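The summary mentions projected descent-ascent methods with step sizes ηx and ηy. As a loose illustration only (not the paper's gradient-free algorithm, and on a much easier strongly-convex-strongly-concave toy problem), a minimal projected gradient descent-ascent sketch looks like this; the function `f`, the box bounds, and the step sizes are all assumptions made up for the example:

```python
import numpy as np

def project_box(v, lo, hi):
    """Euclidean projection onto the box [lo, hi]."""
    return np.clip(v, lo, hi)

def projected_gda(grad_x, grad_y, x0, y0, eta_x, eta_y, steps,
                  box=(-1.0, 1.0)):
    """Projected gradient descent (in x) / ascent (in y).

    Toy sketch: descend on x, ascend on y, project both iterates
    back onto a box after each step.
    """
    x, y = float(x0), float(y0)
    for _ in range(steps):
        x = project_box(x - eta_x * grad_x(x, y), *box)
        y = project_box(y + eta_y * grad_y(x, y), *box)
    return x, y

# Toy saddle problem: f(x, y) = 0.5*x**2 + x*y - 0.5*y**2,
# whose unique saddle point is (0, 0).
gx = lambda x, y: x + y   # df/dx
gy = lambda x, y: x - y   # df/dy

x_star, y_star = projected_gda(gx, gy, x0=0.5, y0=0.5,
                               eta_x=0.05, eta_y=0.1, steps=3000)
```

With these small, unequal step sizes the iterates spiral into the saddle point; the paper's setting (nonsmooth, nonconvex-concave, gradient-free) is substantially harder than this smooth toy case.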
Single-timescale stochastic nonconvex-concave optimization for smooth nonlinear TD learning. arXiv preprint arXiv:2008.10103.
1 Pith paper cites this work. Polarity classification is still indexing.
fields: math.OC
years: 2026
verdicts: UNVERDICTED
representative citing papers: 1
Nonsmooth Nonconvex-Concave Minimax Optimization: Convergence Criteria and Algorithms