Preprint / Working Paper. Year: 2023

Parameter-free projected gradient descent

Abstract

We consider the problem of minimizing a convex function over a closed convex set with Projected Gradient Descent (PGD). We propose a fully parameter-free version of AdaGrad that is adaptive to the distance between the initialization and the optimum, and to the sum of the squared norms of the subgradients. Our algorithm handles projection steps and, compared to classical PGD, involves no restarts, no reweighting along the trajectory, and no additional gradient evaluations. It also achieves the optimal rates of convergence for cumulative regret, up to logarithmic factors. We extend our approach to stochastic optimization and conduct numerical experiments supporting the developed theory.
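For readers wanting a concrete baseline, below is a minimal Python sketch of the standard AdaGrad-norm projected subgradient method that the abstract builds on: the step size is scaled by the inverse square root of the running sum of squared subgradient norms, and each iterate is projected back onto the constraint set. This is not the authors' parameter-free algorithm; their contribution is precisely removing the need to tune the scale parameter (`eta` below). The names `adagrad_norm_pgd`, `grad_f`, and `project` are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def adagrad_norm_pgd(grad_f, project, x0, eta=1.0, n_steps=1000, eps=1e-12):
    """Projected subgradient descent with an AdaGrad-norm step size.

    grad_f  -- returns a subgradient of the convex objective at x
    project -- Euclidean projection onto the closed convex set C
    eta     -- scale parameter; tuning it requires knowledge that
               the paper's parameter-free method does without
    """
    x = np.asarray(x0, dtype=float)
    grad_sq_sum = 0.0                  # running sum of ||g_t||^2
    avg = np.zeros_like(x)             # running average of iterates
    for t in range(1, n_steps + 1):
        g = grad_f(x)
        grad_sq_sum += float(np.dot(g, g))
        step = eta / np.sqrt(grad_sq_sum + eps)  # AdaGrad-norm step size
        x = project(x - step * g)                # projected gradient step
        avg += (x - avg) / t                     # incremental mean update
    return avg

# Example: minimize ||x - b||^2 over the Euclidean unit ball.
b = np.array([2.0, -1.0])
grad_f = lambda x: 2.0 * (x - b)
project = lambda x: x / max(1.0, np.linalg.norm(x))
x_hat = adagrad_norm_pgd(grad_f, project, x0=np.zeros(2))
# x_hat approaches b / ||b||, the projection of b onto the unit ball.
```

The averaged iterate is returned because, for convex objectives, online-to-batch conversion turns a cumulative-regret guarantee into a convergence guarantee for the average of the iterates.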
Main file: CGS-FreeAdaGrad.pdf (875.03 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04105636, version 1 (30-05-2023)

Identifiers

HAL Id: hal-04105636

Cite

Evgenii Chzhen, Christophe Giraud, Gilles Stoltz. Parameter-free projected gradient descent. 2023. ⟨hal-04105636⟩
169 Views
45 Downloads

