pith. machine review for the scientific record.

arXiv: 1410.0390 · v2 · submitted 2014-10-01 · 🧮 math.OC · cs.CC

Recognition: unknown

Simple Complexity Analysis of Simplified Direct Search

Authors on Pith: no claims yet
classification 🧮 math.OC cs.CC
keywords direct epsilon search convex function problem complexity number
0 comments
original abstract

We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting. In particular, we propose and study a simplified variant of the direct search method (of direction type), which we call simplified direct search (SDS). Unlike standard direct search methods, which depend on a large number of parameters that need to be tuned, SDS depends on a single scalar parameter only. Despite relevant research activity in direct search methods spanning several decades, complexity guarantees---bounds on the number of function evaluations needed to find an approximate solution---were not established until very recently. In this paper we give a surprisingly brief and unified analysis of SDS for nonconvex, convex and strongly convex functions. We match the existing complexity results for direct search in their dependence on the problem dimension ($n$) and error tolerance ($\epsilon$), but the overall bounds are simpler, easier to interpret, and have better dependence on other problem parameters. In particular, we show that for the set of directions formed by the standard coordinate vectors and their negatives, the number of function evaluations needed to find an $\epsilon$-solution is $O(n^2 /\epsilon)$ (resp. $O(n^2 \log(1/\epsilon))$) for the problem of minimizing a convex (resp. strongly convex) smooth function. In the nonconvex smooth case, the bound is $O(n^2/\epsilon^2)$, with the goal being the reduction of the norm of the gradient below $\epsilon$.
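To make the abstract's setup concrete, below is a minimal Python sketch of a direct search of directional type over the $2n$ directions $\{\pm e_i\}$ named in the abstract. It is not the authors' exact SDS: the sufficient-decrease (forcing) function $\rho(\alpha) = \alpha^2$, the stepsize-halving factor, the opportunistic polling order, and the stopping rule are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def direct_search_sketch(f, x0, alpha0=1.0, eps=1e-6, max_evals=100_000):
    """Hedged sketch of direct search over D = {+e_i, -e_i}.

    Assumptions (not from the paper): forcing function rho(alpha) = alpha**2,
    stepsize halving on unsuccessful iterations, opportunistic polling,
    and termination once the stepsize falls below eps.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    evals = 1
    alpha = alpha0
    # Positive spanning set: coordinate vectors and their negatives (2n directions).
    D = np.vstack([np.eye(n), -np.eye(n)])
    while alpha > eps and evals < max_evals:
        improved = False
        for d in D:
            trial = x + alpha * d
            f_trial = f(trial)
            evals += 1
            if f_trial < fx - alpha**2:  # sufficient decrease test
                x, fx = trial, f_trial
                improved = True
                break                    # opportunistic: accept first success
        if not improved:
            alpha *= 0.5                 # shrink stepsize after a full failed poll
    return x, fx, evals

# Example: minimize a smooth convex quadratic in R^5.
x_star, f_star, n_evals = direct_search_sketch(lambda x: np.sum(x**2), np.ones(5))
```

Each iteration polls at most $2n$ directions, which is where the $n$-dependence in the evaluation bounds quoted above enters; the $\epsilon$-dependence comes from how many successful and unsuccessful iterations the stepsize schedule allows.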

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Rescaled Asynchronous SGD: Optimal Distributed Optimization under Data and System Heterogeneity

    cs.LG 2026-05 unverdicted novelty 6.0

    Rescaled ASGD recovers convergence to the true global objective by rescaling worker stepsizes proportional to computation times, matching the known time lower bound in the leading term under non-convex smoothness and ...