Negative Dependence Tightens Variational Bounds
Abstract
Importance weighted variational inference (IWVI) is a promising strategy for learning latent variable models. IWVI uses new variational bounds, known as Monte Carlo objectives (MCOs), obtained by replacing intractable integrals with Monte Carlo estimates, usually computed via importance sampling. Burda et al. (2016) showed that increasing the number of importance samples provably tightens the bound, reducing its gap to the likelihood. We show that, in a similar fashion, increasing the negative dependence of the importance weights monotonically increases, and hence tightens, the bound. To this end, we use the supermodular order as a measure of dependence. This simple result provides theoretical support for several approaches that leverage negative dependence to perform efficient variational inference of deep generative models.
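For reference, a minimal sketch of the Monte Carlo objective in question: the importance weighted lower bound of Burda et al. (2016) with K independent importance samples, written in commonly used notation (the symbols p_theta, q_phi, x, z_k, and K are not defined in this abstract and are used here only for illustration):

\[
\mathcal{L}_K(x) \;=\; \mathbb{E}_{z_1,\dots,z_K \sim q_\phi(\cdot\mid x)}\!\left[\log \frac{1}{K}\sum_{k=1}^{K}\frac{p_\theta(x, z_k)}{q_\phi(z_k\mid x)}\right] \;\le\; \log p_\theta(x),
\qquad
\mathcal{L}_K(x) \;\le\; \mathcal{L}_{K+1}(x).
\]

The second inequality is the monotonicity-in-K result of Burda et al. (2016); the present work establishes an analogous monotonicity when the importance weights are made more negatively dependent, rather than more numerous.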