Conference Papers, Year: 2020

Negative Dependence Tightens Variational Bounds

Abstract

Importance weighted variational inference (IWVI) is a promising strategy for learning latent variable models. IWVI relies on new variational bounds, known as Monte Carlo objectives (MCOs), obtained by replacing intractable integrals with Monte Carlo estimates, usually computed via importance sampling. Burda et al. (2016) showed that increasing the number of importance samples provably tightens the gap between the bound and the likelihood. We show that, in a somewhat similar fashion, increasing the negative dependence of the importance weights monotonically increases the bound. To this end, we use the supermodular order as a measure of dependence. Our simple result provides theoretical support for several approaches that leverage negative dependence to perform efficient variational inference of deep generative models.
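For concreteness, the MCO in question is the importance weighted bound of Burda et al. (2016): with z_1, ..., z_K drawn from the proposal q(z|x) and weights w_k = p(x, z_k)/q(z_k|x), the bound reads L_K = E[log (1/K) Σ_k w_k] ≤ log p(x). The sketch below is a minimal, illustrative Python experiment (not the authors' code or experimental setup): on an assumed toy Gaussian model, it contrasts i.i.d. importance samples with antithetic ones, a classical way to induce negative dependence among the weights. All names, the model, and the choice of x are assumptions made for illustration.

```python
# Illustrative sketch (not the authors' code): on a toy Gaussian model,
# compare the Monte Carlo objective (MCO) under i.i.d. importance samples
# with antithetic samples, which make the weights negatively dependent.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy model: z ~ N(0, 1), x | z ~ N(z, 1), hence p(x) = N(x; 0, 2).
x = 1.5
log_px = norm.logpdf(x, loc=0.0, scale=np.sqrt(2.0))

# Proposal q(z | x) = prior N(0, 1), so log w(z) = log p(x | z).
def log_weight(z):
    return norm.logpdf(x, loc=z, scale=1.0)

def mco(z):
    # MCO: log of the average importance weight over K samples.
    lw = log_weight(z)
    return np.logaddexp.reduce(lw) - np.log(lw.size)

K, reps = 8, 20_000
iid = np.array([mco(rng.standard_normal(K)) for _ in range(reps)])

def antithetic(k):
    # Reflect each draw about the proposal mean (0): the pairs (z, -z)
    # yield negatively correlated weights for this model.
    eps = rng.standard_normal(k // 2)
    return np.concatenate([eps, -eps])

anti = np.array([mco(antithetic(K)) for _ in range(reps)])

print(f"log p(x)           = {log_px:.4f}")
print(f"E[MCO], i.i.d.     = {iid.mean():.4f}")
print(f"E[MCO], antithetic = {anti.mean():.4f}")  # typically closer to log p(x)
```

Both empirical averages lie below log p(x), as MCOs are lower bounds; the antithetic estimate typically lands closer, consistent with the paper's claim that more negatively dependent importance weights (in the supermodular order) yield a larger bound.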

Dates and versions

hal-03044115, version 1 (07-12-2020)

Identifiers

  • HAL Id: hal-03044115, version 1

Cite

Pierre-Alexandre Mattei, Jes Frellsen. Negative Dependence Tightens Variational Bounds. ICML 2020 - 2nd Workshop on Negative Dependence and Submodularity for ML, Jul 2020, Vienna / Online, Austria. ⟨hal-03044115⟩