Quasi-Symplectic Langevin Variational Autoencoder
Abstract
The variational autoencoder (VAE) is a popular and well-studied generative model widely used in neural learning research. To apply VAEs to practical tasks involving massive, high-dimensional datasets, one must address the difficulty of constructing low-variance evidence lower bounds (ELBOs). Markov chain Monte Carlo (MCMC) is an effective approach for tightening the ELBO when approximating the posterior distribution. The Hamiltonian Variational Autoencoder (HVAE) is an MCMC-inspired approach for constructing a low-variance ELBO that is also amenable to the reparameterization trick. In this work, we propose a Quasi-symplectic Langevin Variational Autoencoder (LangevinVAE) that incorporates gradient information into the inference process through Langevin dynamics. We show the effectiveness of the proposed approach on toy and real-world examples.
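As a rough illustration of the gradient-informed transitions the abstract refers to, the sketch below applies a few unadjusted overdamped Langevin steps to refine a latent sample toward a target posterior; the function names, step size, and the toy Gaussian target are illustrative assumptions, not the authors' quasi-symplectic integrator or ELBO estimator.

```python
import numpy as np

def langevin_step(z, grad_log_post, step_size, rng):
    """One overdamped Langevin transition: a half gradient step on the
    log-posterior plus Gaussian noise (unadjusted Langevin algorithm)."""
    noise = rng.standard_normal(z.shape)
    return z + 0.5 * step_size * grad_log_post(z) + np.sqrt(step_size) * noise

# Toy target: standard normal posterior, so grad log p(z|x) = -z.
rng = np.random.default_rng(0)
z = rng.standard_normal(2)              # initial draw, e.g. from an encoder q(z|x)
for _ in range(20):                      # a few gradient-informed refinements
    z = langevin_step(z, lambda z: -z, step_size=0.05, rng=rng)
print(z)
```

In a Langevin-based VAE, transitions of this kind are differentiable in the initial noise, which is what keeps the resulting ELBO estimator compatible with the reparameterization trick.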