Preprint / working paper, 2020

Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Abstract

In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. More precisely, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective; this class covers the stochastic gradient descent method and variants of the incremental approaches SAGA, SVRG, and MISO/Finito/SDCA. This point of view has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; and (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we propose a new accelerated stochastic gradient descent algorithm and an accelerated SVRG algorithm with optimal complexity that is robust to stochastic noise.
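For context, the estimate sequence that the abstract refers to is Nesterov's classical construction, which the paper extends to the stochastic composite setting. Below is a minimal reminder of the deterministic definition and the lemma that makes it useful, written in the notation of Nesterov's Introductory Lectures rather than necessarily the paper's own.

```latex
% Nesterov's classical estimate sequence (deterministic setting).
% A pair of sequences $(\varphi_k)_{k \ge 0}$ (functions) and
% $(\lambda_k)_{k \ge 0}$ (scalars with $\lambda_k \to 0$) is an
% estimate sequence of $f$ if, for every $x$ and every $k \ge 0$,
\[
  \varphi_k(x) \;\le\; (1 - \lambda_k)\, f(x) + \lambda_k\, \varphi_0(x).
\]
% The key lemma: if the iterates of a method satisfy
% $f(x_k) \le \varphi_k^\star := \min_x \varphi_k(x)$, then
\[
  f(x_k) - f^\star \;\le\; \lambda_k \bigl( \varphi_0(x^\star) - f^\star \bigr)
  \;\longrightarrow\; 0,
\]
% so the convergence rate of the method is dictated by how fast
% $\lambda_k$ vanishes.
```

In this light, the surrogate-minimization view of the abstract amounts to choosing each $\varphi_k$ as an iteratively refined model of the objective whose minimizer yields the next iterate; roughly speaking, the stochastic extension must additionally control the error terms that noisy gradient estimates introduce into these inequalities.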
Main file
lower_bound.pdf (815.5 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01993531 , version 1 (24-01-2019)
hal-01993531 , version 2 (21-04-2020)
hal-01993531 , version 3 (27-04-2020)
hal-01993531 , version 4 (04-09-2020)

Cite

Andrei Kulunchakov, Julien Mairal. Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise. 2020. ⟨hal-01993531v3⟩

Collections

LJK-GI-CVGI
485 Views
336 Downloads
