On the convergence of gradient-like flows with noisy gradient input
Abstract
In view of solving convex optimization problems with noisy gradient input, we analyze the asymptotic behavior of gradient-like flows that are subject to stochastic disturbances. Specifically, we focus on the widely studied class of mirror descent methods for constrained convex programming and we examine the dynamics' convergence and concentration properties in the presence of noise. In the small noise limit, we show that the dynamics converge to the solution set of the underlying problem (a.s.). Otherwise, if the noise is persistent, we estimate the measure of the dynamics' long-run concentration around interior solutions and their convergence to boundary solutions that are sufficiently "robust". Finally, we show that a suitably rectified variant of the method converges irrespective of the magnitude of the noise (or the structure of the underlying convex program), and we derive an explicit estimate for its rate of convergence.
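For intuition, the following is a minimal discrete-time sketch of stochastic mirror descent with a noisy gradient oracle, assuming an entropic mirror map on the probability simplex. The quadratic objective, the noise level `sigma`, and the 1/√t step size schedule are illustrative assumptions, not the paper's exact (continuous-time) dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_mirror_descent(grad_f, dim, sigma=0.1, steps=10_000):
    """Stochastic mirror descent on the simplex with noisy gradient input."""
    y = np.zeros(dim)        # dual ("score") variable
    x_avg = np.zeros(dim)    # running ergodic average of the primal iterates
    for t in range(1, steps + 1):
        # Mirror step: map dual scores to a primal point via the
        # entropic (logit) choice map.
        x = np.exp(y - y.max())
        x /= x.sum()
        # Noisy first-order oracle: true gradient plus a zero-mean disturbance.
        g = grad_f(x) + sigma * rng.standard_normal(dim)
        # Dual update with a vanishing step size (the small-noise regime,
        # where the iterates track the underlying gradient-like flow).
        y -= g / np.sqrt(t)
        x_avg += (x - x_avg) / t
    return x_avg

# Toy usage: minimize f(x) = ||x - x*||^2 / 2 over the 5-simplex,
# whose (interior) solution is x* itself.
target = rng.dirichlet(np.ones(5))
print(stochastic_mirror_descent(lambda x: x - target, dim=5))
```

With the vanishing step size above, the averaged iterate approaches the solution despite the persistent noise, mirroring the small-noise convergence regime described in the abstract.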