Conference Paper, Year: 2021

Anderson acceleration of coordinate descent

Abstract

Acceleration of first-order methods is mainly obtained via inertial techniques à la Nesterov, or via nonlinear extrapolation. The latter has seen a recent surge of interest, with successful applications to gradient and proximal gradient techniques. On multiple machine learning problems, coordinate descent achieves performance significantly superior to full-gradient methods. Speeding up coordinate descent in practice is not easy, however: inertially accelerated versions of coordinate descent are theoretically accelerated, but might not always lead to practical speed-ups. We propose an accelerated version of coordinate descent using extrapolation, showing considerable speed-ups in practice compared to inertially accelerated coordinate descent and extrapolated (proximal) gradient descent. Experiments on least squares, Lasso, elastic net and logistic regression validate the approach.
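To make the extrapolation concrete, here is a minimal Python/NumPy sketch, assuming a least-squares objective and offline-style Anderson extrapolation applied every K epochs of cyclic coordinate descent. The function names, the choice of K, and the objective-decrease safeguard are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np


def cd_epoch(A, b, x, lipschitz):
    """One pass of cyclic coordinate descent on 0.5 * ||A x - b||^2."""
    residual = A @ x - b
    for j in range(A.shape[1]):
        old = x[j]
        # Exact coordinate-wise minimization for least squares
        x[j] -= A[:, j] @ residual / lipschitz[j]
        residual += (x[j] - old) * A[:, j]
    return x


def anderson_cd_least_squares(A, b, n_epochs=100, K=5):
    """Coordinate descent with Anderson extrapolation every K epochs (sketch)."""
    n_features = A.shape[1]
    lipschitz = np.sum(A ** 2, axis=0)  # per-coordinate Lipschitz constants
    x = np.zeros(n_features)
    iterates = []
    for epoch in range(n_epochs):
        x = cd_epoch(A, b, x, lipschitz)
        iterates.append(x.copy())
        if len(iterates) == K + 1:
            # Differences of consecutive iterates, shape (n_features, K)
            U = np.diff(np.asarray(iterates), axis=0).T
            try:
                # Solve (U^T U) z = 1 and normalize to get extrapolation weights
                z = np.linalg.solve(U.T @ U, np.ones(K))
                c = z / z.sum()
                x_extra = np.asarray(iterates)[1:].T @ c
                # Safeguard (illustrative): keep the extrapolated point only
                # if it decreases the least-squares objective
                if np.linalg.norm(A @ x_extra - b) < np.linalg.norm(A @ x - b):
                    x = x_extra
            except np.linalg.LinAlgError:
                pass  # singular extrapolation system: skip this round
            iterates = []
    return x
```

On a small random instance (e.g. `A = np.random.randn(100, 50)`, `b = np.random.randn(100)`), this kind of extrapolated coordinate descent should reach a given accuracy in fewer epochs than plain coordinate descent; the paper's experiments extend the idea to Lasso, elastic net and logistic regression, where the coordinate updates are proximal rather than exact least-squares minimizations.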
Main file: extrapolcd.pdf (2.43 MB). Origin: Files produced by the author(s).

Dates and versions

hal-03019684, version 1 (23-11-2020)

Identifiers

  • HAL Id: hal-03019684, version 1

Cite

Quentin Bertrand, Mathurin Massias. Anderson acceleration of coordinate descent. AISTATS 2021 - 24th International Conference on Artificial Intelligence and Statistics, Apr 2021, San Diego / Virtual, United States. ⟨hal-03019684⟩
