A new regret analysis for Adam-type algorithms
Abstract
In this paper, we focus on a theory-practice gap for Adam and its variants (AMSGrad, AdamNC, etc.). In practice, these algorithms are used with a constant first-order moment parameter β₁ (typically between 0.9 and 0.99). In theory, regret guarantees for online convex optimization require a rapidly decaying schedule β₁ → 0. We show that this requirement is an artifact of the standard analysis, and we propose a novel framework that allows us to derive optimal, data-dependent regret bounds with a constant β₁, without further assumptions. We also demonstrate the flexibility of our analysis on a wide range of algorithms and settings.
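For context, the update in question is the standard Adam recursion (the notation below is the conventional one from Kingma & Ba, 2015, and is not quoted from this paper). Given stochastic gradients g_t, Adam maintains exponential moving averages of the first and second moments and takes a coordinate-wise rescaled step:

$$m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t, \qquad v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2, \qquad x_{t+1} = x_t - \alpha_t\, \frac{m_t}{\sqrt{v_t} + \varepsilon}.$$

Earlier regret analyses (e.g., those for Adam and AMSGrad) replace the constant β₁ with a time-varying β₁,ₜ that decays to zero, for instance β₁,ₜ = β₁ λ^(t−1) with λ < 1; the paper's contribution is to show that such decay is unnecessary for the guarantee.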