Conference paper, 2021

Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements

Abstract

We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function, as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, the application of existing order-optimal adaptive methods (such as UnixGrad or AcceleGrad) is not possible, especially in the presence of randomness and uncertainty. The proposed method, adaptive mirror descent (AdaMir), aims to close this gap by simultaneously achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
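For readers new to the relative setting, the following standard definitions (a brief sketch in our own notation, not reproduced from the paper) make precise what "smooth relative to a reference Bregman function h" means. Given a convex reference function h, the induced Bregman divergence is

    D_h(x, y) = h(x) − h(y) − ⟨∇h(y), x − y⟩,

and f is L-smooth relative to h if

    f(x) ≤ f(y) + ⟨∇f(y), x − y⟩ + L · D_h(x, y)   for all x, y in the domain.

Taking h(x) = (1/2)‖x‖² recovers ordinary Euclidean smoothness, whereas singular objectives (for instance, the Poisson likelihoods arising in tomography) become relatively smooth under a suitable non-Euclidean h. The template on which the method builds is the usual mirror descent update

    x_{t+1} = argmin_{x ∈ X} { γ_t ⟨∇f(x_t), x − x_t⟩ + D_h(x, x_t) },

with AdaMir selecting the step-size γ_t adaptively; the paper's specific adaptive rule is given in the PDF below and is not reproduced here.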
Main file: Adamir.pdf (953.93 KB). Origin: files produced by the author(s).

Dates and versions

hal-03342998, version 1 (13-09-2021)

Identifiers

  • HAL Id: hal-03342998, version 1

Cite

Kimon Antonakopoulos, Panayotis Mertikopoulos. Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements. NeurIPS 2021 - 35th International Conference on Neural Information Processing Systems, Dec 2021, Virtual. ⟨hal-03342998⟩