Preprint / Working Paper, 2024

Gradient correlation is needed to accelerate SGD with momentum

Abstract

Empirically, it has been observed that adding momentum to Stochastic Gradient Descent (SGD) accelerates the convergence of the algorithm. However, the literature has been rather pessimistic, even in the convex case, about the possibility of proving this observation theoretically. We investigate the possibility of obtaining accelerated convergence of the Stochastic Nesterov Accelerated Gradient (SNAG), a momentum-based version of SGD, when minimizing a sum of functions in a convex setting. We demonstrate that the average correlation between gradients allows one to verify the strong growth condition, which is the key ingredient for obtaining acceleration with SNAG. Numerical experiments, on both linear regression and deep neural network optimization, confirm our theoretical results in practice.
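For readers unfamiliar with the method discussed in the abstract, the following is a minimal sketch of a stochastic Nesterov-type update applied to a finite sum of functions. The interface (grad_i, lr, momentum) and the parameter values are illustrative placeholders, not the parameterization analyzed in the paper.

    import numpy as np

    def snag(grad_i, x0, n_funcs, lr=0.01, momentum=0.9, n_iters=1000, rng=None):
        """Sketch of Stochastic Nesterov Accelerated Gradient (SNAG).

        grad_i(y, i) returns the gradient of the i-th function f_i at y;
        the objective is f(x) = (1/n) * sum_i f_i(x).
        lr and momentum are placeholder values for illustration only.
        """
        rng = rng or np.random.default_rng()
        x = x0.copy()
        y = x0.copy()                           # extrapolated point where gradients are evaluated
        for _ in range(n_iters):
            i = rng.integers(n_funcs)           # sample one summand uniformly at random
            x_new = y - lr * grad_i(y, i)       # stochastic gradient step from the extrapolated point
            y = x_new + momentum * (x_new - x)  # Nesterov extrapolation (momentum step)
            x = x_new
        return x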
Main file: racoga_arxiv.pdf (1.95 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04728121, version 1 (09-10-2024)

Identifiers

  • HAL Id: hal-04728121, version 1

Cite

Julien Hermant, Marien Renaud, Jean-François Aujol, Charles Dossal, Aude Rondepierre. Gradient correlation is needed to accelerate SGD with momentum. 2024. ⟨hal-04728121⟩