pith. machine review for the scientific record.

arxiv: 1605.00125 · v6 · submitted 2016-04-30 · 🧮 math.OC

Recognition: unknown

Efficiency of minimizing compositions of convex functions and smooth maps

Authors on Pith: no claims yet
classification: 🧮 math.OC
keywords: prox-linear, convex, smooth, efficiency, minimizing, method
original abstract

We consider global efficiency of algorithms for minimizing a sum of a convex function and a composition of a Lipschitz convex function with a smooth map. The basic algorithm we rely on is the prox-linear method, which in each iteration solves a regularized subproblem formed by linearizing the smooth map. When the subproblems are solved exactly, the method has efficiency $\mathcal{O}(\varepsilon^{-2})$, akin to gradient descent for smooth minimization. We show that when the subproblems can only be solved by first-order methods, a simple combination of smoothing, the prox-linear method, and a fast-gradient scheme yields an algorithm with complexity $\widetilde{\mathcal{O}}(\varepsilon^{-3})$. The technique readily extends to minimizing an average of $m$ composite functions, with complexity $\widetilde{\mathcal{O}}(m/\varepsilon^{2}+\sqrt{m}/\varepsilon^{3})$ in expectation. We round off the paper with an inertial prox-linear method that automatically accelerates in presence of convexity.
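The prox-linear iteration described above can be sketched in code. The minimal sketch below takes the special case h(z) = ½‖z‖² with g ≡ 0, where the regularized linearized subproblem has a closed-form solution (a Levenberg–Marquardt-style step); the map `c`, its Jacobian, and the step parameter `t` are illustrative choices, not taken from the paper.

```python
import numpy as np

def c(x):
    # a smooth nonlinear map R^2 -> R^2 (Rosenbrock-style residuals)
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def jac(x):
    # Jacobian of c at x
    return np.array([[-20.0 * x[0], 10.0],
                     [-1.0, 0.0]])

def prox_linear(x0, t=10.0, iters=200):
    """Prox-linear method for min_x h(c(x)) with h(z) = 0.5*||z||^2.

    Each iteration solves the regularized subproblem formed by
    linearizing the smooth map c:
        min_d 0.5*||c(x) + J(x) d||^2 + (1/(2t))*||d||^2,
    which here has the closed-form solution below.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        J, r = jac(x), c(x)
        d = np.linalg.solve(J.T @ J + np.eye(len(x)) / t, -J.T @ r)
        x = x + d
    return x

x_star = prox_linear([-1.2, 1.0])
print(x_star)  # converges toward the minimizer at [1, 1]
```

For a general Lipschitz convex h (e.g. a norm), the subproblem has no closed form and is itself solved by a first-order method, which is exactly the setting where the paper's smoothing plus fast-gradient combination yields the $\widetilde{\mathcal{O}}(\varepsilon^{-3})$ complexity.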

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.