The limited-memory recursive variational Gaussian approximation (L-RVGA)

Journal article in Statistics and Computing, 2023

Abstract

We consider the problem of computing a Gaussian approximation to the posterior distribution of a parameter, given a large number N of observations and a Gaussian prior, when the dimension d of the parameter is also large. To address this problem we build on a recently introduced recursive algorithm for variational Gaussian approximation of the posterior, called the recursive variational Gaussian approximation (RVGA), which is a single-pass algorithm free of parameter tuning. In this paper we consider the case where the parameter dimension d is high, and we propose a novel version of RVGA that scales linearly in the dimension d (as well as in the number of observations N) and requires only linear storage capacity in d. This is made possible by a novel recursive expectation-maximization (EM) algorithm for factor analysis, introduced herein, which approximates at each step the covariance matrix of the Gaussian distribution conveying the uncertainty in the parameter. The approach is successfully illustrated on high-dimensional least-squares and logistic regression problems, and generalized to a large class of nonlinear models.
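The structural idea stated in the abstract, maintaining the covariance in a factor-analysis (low-rank plus diagonal) form so that storage stays linear in d, can be illustrated with a short sketch. The snippet below is not the paper's recursive L-RVGA update; it is the classical batch EM for a factor-analysis model, fitting an approximation S ≈ W W^T + diag(psi) to a fixed covariance S with all matrix inversions restricted to p x p matrices, so that cost and storage are linear in d. All function and variable names are illustrative assumptions, not taken from the paper.

# Minimal sketch (assumption: classical batch factor-analysis EM, not the
# paper's recursive L-RVGA update): fit S ~= W W^T + diag(psi) with W of
# shape (d, p), so storage is O(d p) instead of O(d^2).
import numpy as np

def factor_analysis_em(S, p, n_iters=100, seed=0):
    """Fit S ~= W @ W.T + np.diag(psi); only p x p matrices are inverted."""
    d = S.shape[0]
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.standard_normal((d, p))
    psi = np.full(d, np.trace(S) / d)            # diagonal noise variances
    for _ in range(n_iters):
        # E-step: beta = W^T (W W^T + Psi)^{-1}, via the Woodbury identity
        # so that the only inverse taken is p x p (cost linear in d).
        Winv_psi = W / psi[:, None]              # Psi^{-1} W, shape (d, p)
        G = np.linalg.inv(np.eye(p) + W.T @ Winv_psi)
        beta = G @ Winv_psi.T                    # shape (p, d)
        # M-step (standard factor-analysis updates written in terms of S).
        SbT = S @ beta.T                         # shape (d, p)
        Ezz = np.eye(p) - beta @ W + beta @ SbT  # E[z z^T] aggregated
        W = SbT @ np.linalg.inv(Ezz)
        psi = np.clip(np.diag(S - W @ beta @ S), 1e-8, None)
    return W, psi

# Usage: approximate a synthetic d x d covariance with p << d factors.
d, p = 200, 5
A = np.random.default_rng(1).standard_normal((d, p))
S_true = A @ A.T + 0.5 * np.eye(d)
W, psi = factor_analysis_em(S_true, p)
err = np.linalg.norm(W @ W.T + np.diag(psi) - S_true) / np.linalg.norm(S_true)
print(f"relative Frobenius error: {err:.3f}")

Roughly speaking, the paper's contribution is to turn such a batch fit into a recursive EM that updates the factor-analysis approximation of the posterior covariance one observation at a time inside the RVGA recursion.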

Dates and versions

hal-03501920 , version 1 (23-12-2021)
hal-03501920 , version 2 (05-11-2022)
hal-03501920 , version 3 (22-03-2023)

Identifiers

HAL Id: hal-03501920
DOI: 10.1007/s11222-023-10239-x

Cite

Marc Lambert, Silvère Bonnabel, Francis Bach. The limited-memory recursive variational Gaussian approximation (L-RVGA). Statistics and Computing, 2023, 33 (70), ⟨10.1007/s11222-023-10239-x⟩. ⟨hal-03501920v3⟩