Conference Papers, Year: 2022

Dimension-free convergence rates for gradient Langevin dynamics in RKHS

Abstract

Gradient Langevin dynamics (GLD) and stochastic GLD (SGLD) have attracted considerable attention lately, as a way to provide convergence guarantees in a non-convex setting. However, the known rates grow exponentially with the dimension of the space under the dissipative condition. In this work, we provide a convergence analysis of GLD and SGLD when the optimization space is an infinite-dimensional Hilbert space. More precisely, we derive non-asymptotic, dimension-free convergence rates for GLD/SGLD when performing regularized non-convex optimization in a reproducing kernel Hilbert space. Among other tools, the convergence analysis relies on the properties of a stochastic differential equation, its discrete-time Galerkin approximation, and the geometric ergodicity of the associated Markov chains.
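For reference, the sketch below illustrates the standard GLD update the abstract refers to, applied to the coefficients of a finite (Galerkin-style) kernel expansion with a ridge penalty. This is a minimal illustration, not the authors' algorithm: the step size, inverse temperature, objective, and all names are placeholder assumptions chosen for the example.

```python
import numpy as np

def gld(grad_F, theta0, step=1e-3, inv_temp=1e3, n_iters=1000, rng=None):
    """Gradient Langevin dynamics.

    Update: theta <- theta - step * grad_F(theta) + sqrt(2*step/inv_temp) * xi,
    with xi standard Gaussian noise. SGLD replaces grad_F by an unbiased
    stochastic gradient estimate.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(n_iters):
        noise = rng.standard_normal(theta.shape)
        theta = theta - step * grad_F(theta) + np.sqrt(2.0 * step / inv_temp) * noise
    return theta

def make_grad(K, y, lam):
    """Gradient of a regularized least-squares objective on expansion coefficients:
    F(alpha) = 0.5 * ||K alpha - y||^2 + lam * alpha^T K alpha  (K symmetric)."""
    return lambda a: K @ (K @ a - y) + 2.0 * lam * (K @ a)

# Toy usage (illustrative only): Gaussian kernel on random 1-D points.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=20)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(20)
alpha = gld(make_grad(K, y, lam=1e-2), np.zeros(20))
```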

Dates and versions

hal-03920387, version 1 (03-01-2023)

Identifiers

  • HAL Id: hal-03920387, version 1

Cite

Boris Muzellec, Kanji Sato, Mathurin Massias, Taiji Suzuki. Dimension-free convergence rates for gradient Langevin dynamics in RKHS. COLT 2022 - 35th Annual Conference on Learning Theory, Jul 2022, London, United Kingdom. ⟨hal-03920387⟩