Linear Convergence of Comparison-based Step-size Adaptive Randomized Search via Stability of Markov Chains
Abstract
In this paper, we consider \emph{comparison-based} adaptive stochastic algorithms for solving numerical optimisation problems. We focus on a specific subclass of algorithms, \emph{comparison-based step-size adaptive randomized search} (CB-SARS), where the state variables at a given iteration are a point of the search space and a positive parameter, the step-size, which typically controls the overall standard deviation of the underlying search distribution.
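To make the algorithm class concrete, the following is a minimal sketch of one well-known CB-SARS instance, a (1+1)-ES with one-fifth success rule; the function name, default damping and update constants are illustrative assumptions and not the paper's general framework.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0, iterations=1000, d=None):
    """(1+1)-ES with one-fifth success rule: a simple CB-SARS instance.

    The state is the pair (x, sigma); only comparisons of f-values are used,
    never the f-values themselves.
    """
    x = np.asarray(x0, dtype=float)
    sigma = float(sigma0)
    n = x.size
    d = d if d is not None else 1.0 + n / 2.0       # damping, an assumed default
    fx = f(x)
    for _ in range(iterations):
        candidate = x + sigma * np.random.randn(n)  # sample around the current point
        fc = f(candidate)
        if fc <= fx:                                # comparison-based acceptance
            x, fx = candidate, fc
            sigma *= np.exp(1.0 / d)                # success: enlarge the step-size
        else:
            sigma *= np.exp(-0.25 / d)              # failure: shrink it (target success rate 1/5)
    return x, sigma

# Example usage on a scaling-invariant function (a norm):
# x_final, sigma_final = one_plus_one_es(lambda z: np.linalg.norm(z), np.ones(10), 1.0)
```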
We investigate the \emph{linear} convergence of CB-SARS on \emph{scaling-invariant} objective functions. Scaling-invariant functions preserve the ordering of points with respect to their function values when the points are scaled by the same positive factor (the scaling is done w.r.t.\ a fixed reference point). This class of functions includes strictly increasing transformations of norms as well as many \emph{non-quasi-convex} and \emph{non-continuous} functions. On scaling-invariant functions, we show the existence of a homogeneous Markov chain, as a consequence of natural invariance properties of CB-SARS (essentially scale-invariance and invariance to strictly increasing transformations of the objective function).
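For concreteness, one way to write the scaling-invariance property (with $x^\star$ denoting the fixed reference point, a notation assumed here) is
\[
\forall \rho > 0,\ \forall x, y:\quad
f\bigl(x^\star + \rho\,(x - x^\star)\bigr) \le f\bigl(x^\star + \rho\,(y - x^\star)\bigr)
\;\Longleftrightarrow\;
f(x) \le f(y).
\]
In particular, $f(x) = g(\|x - x^\star\|)$ with $g$ strictly increasing satisfies this property.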
We then derive sufficient conditions for \emph{global linear convergence} of CB-SARS, expressed in terms of different stability conditions on the normalised homogeneous Markov chain (irreducibility, positivity, Harris recurrence, geometric ergodicity), and thus define a general methodology for proving global linear convergence of CB-SARS algorithms on scaling-invariant functions. As a by-product, we provide a connection between \emph{comparison-based} adaptive stochastic algorithms and Markov chain Monte Carlo (MCMC) algorithms.
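To fix ideas, a hedged sketch of the form such a linear convergence statement takes (with $X_t$ and $\sigma_t$ denoting the current point and step-size and $x^\star$ the optimum, notation assumed here): there exists a convergence rate $\mathrm{CR} > 0$ such that, almost surely,
\[
\lim_{t \to \infty} \frac{1}{t} \ln \frac{\|X_t - x^\star\|}{\|X_0 - x^\star\|} \;=\; -\,\mathrm{CR},
\qquad
\lim_{t \to \infty} \frac{1}{t} \ln \frac{\sigma_t}{\sigma_0} \;=\; -\,\mathrm{CR},
\]
where $\mathrm{CR}$ is expressed as an expectation under the invariant probability measure of the normalised chain $Z_t = (X_t - x^\star)/\sigma_t$, the stability conditions above guaranteeing that a law of large numbers applies to this chain.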