pith. machine review for the scientific record.

arxiv: 1806.02694 · v1 · submitted 2018-06-07 · 🧮 math.OC


Gradient Method for Optimization on Riemannian Manifolds with Lower Bounded Curvature

keywords: stepsize, gradient method, Lipschitz, Riemannian, adaptive, bounded, continuous
abstract

The gradient method for minimizing a differentiable convex function on Riemannian manifolds with lower bounded sectional curvature is analyzed in this paper. The analysis of the method is presented with three different finite procedures for determining the stepsize, namely, the Lipschitz stepsize, the adaptive stepsize, and Armijo's stepsize. The first procedure requires that the objective function have a Lipschitz continuous gradient, which is not necessary for the other approaches. Convergence of the whole sequence to a minimizer, without any level set boundedness assumption, is proved. An iteration-complexity bound for functions with Lipschitz continuous gradient is also presented. Numerical experiments are provided to illustrate the effectiveness of the method in this new setting and to certify the obtained theoretical results. In particular, we consider the problem of finding the Riemannian center of mass, the so-called Karcher mean. Our numerical experiments indicate that the adaptive stepsize is a promising scheme that is worth considering.
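To make the setting concrete, here is a minimal sketch (not from the paper) of a Riemannian gradient method for the Karcher mean on the unit sphere S^2, using the standard exponential and logarithm maps and a fixed stepsize rather than the Lipschitz, adaptive, or Armijo procedures analyzed in the paper; all function names and the sample points are illustrative assumptions.

```python
import numpy as np

def log_map(p, q):
    # Riemannian logarithm on the unit sphere: tangent vector at p pointing to q.
    c = np.clip(np.dot(p, q), -1.0, 1.0)
    theta = np.arccos(c)
    if theta < 1e-12:
        return np.zeros_like(p)
    return theta / np.sin(theta) * (q - c * p)

def exp_map(p, v):
    # Riemannian exponential on the sphere: follow the geodesic from p along v.
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return p
    return np.cos(norm) * p + np.sin(norm) / norm * v

def karcher_mean(points, step=1.0, iters=100, tol=1e-10):
    # Gradient descent for f(p) = (1/2n) * sum_i d(p, q_i)^2;
    # minus the Riemannian gradient at p is the mean of the log maps.
    p = points[0] / np.linalg.norm(points[0])
    for _ in range(iters):
        grad = np.mean([log_map(p, q) for q in points], axis=0)
        if np.linalg.norm(grad) < tol:
            break
        p = exp_map(p, step * grad)
    return p

# Hypothetical sample points clustered near the north pole of S^2.
points = [np.array([0.0, 0.1, 1.0]), np.array([0.1, 0.0, 1.0]),
          np.array([-0.1, -0.1, 1.0])]
points = [q / np.linalg.norm(q) for q in points]
mean = karcher_mean(points)
```

With the fixed stepsize 1.0 this reduces to the classical fixed-point iteration for the center of mass, which converges when the data lie in a sufficiently small geodesic ball; the paper's adaptive and Armijo rules replace the fixed step with a finite stepsize-selection procedure.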

This paper has not been read by Pith yet.

discussion (0)
