Conference paper, Year: 2018

Mixed batches and symmetric discriminators for GAN training

Abstract

Generative adversarial networks (GANs) are powerful generative models based on providing feedback to a generative network via a discriminator network. However, the discriminator usually assesses individual samples. This prevents the discriminator from accessing global distributional statistics of generated samples, and often leads to mode dropping: the generator models only part of the target distribution. We propose to feed the discriminator with mixed batches of true and fake samples, and train it to predict the ratio of true samples in the batch. The latter score does not depend on the order of samples in a batch. Rather than learning this invariance, we introduce a generic permutation-invariant discriminator architecture. This architecture is provably a universal approximator of all symmetric functions. Experimentally, our approach reduces mode collapse in GANs on two synthetic datasets, and obtains good results on the CIFAR10 and CelebA datasets, both qualitatively and quantitatively.
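
As an illustration of the idea only, not the authors' exact architecture, the sketch below builds a mixed batch of real and fake samples, labels it with the fraction of real samples it contains, and scores it with a discriminator whose mean pooling over the batch makes the prediction invariant to sample order. The names (BatchDiscriminator, make_mixed_batch), layer sizes, and loss choice are illustrative assumptions.

    # Minimal PyTorch sketch of a permutation-invariant batch discriminator
    # trained to predict the ratio of real samples in a mixed batch.
    # This is an assumption-laden illustration, not the paper's architecture.
    import torch
    import torch.nn as nn

    class BatchDiscriminator(nn.Module):
        def __init__(self, in_dim=784, feature_dim=128):
            super().__init__()
            # Per-sample feature extractor, shared across the batch.
            self.encoder = nn.Sequential(
                nn.Linear(in_dim, 256), nn.ReLU(),
                nn.Linear(256, feature_dim), nn.ReLU(),
            )
            # Head applied to a batch-level summary; averaging over the batch
            # makes the output invariant to the order of samples.
            self.head = nn.Sequential(
                nn.Linear(feature_dim, 64), nn.ReLU(),
                nn.Linear(64, 1), nn.Sigmoid(),
            )

        def forward(self, batch):
            feats = self.encoder(batch)      # (B, feature_dim)
            pooled = feats.mean(dim=0)       # permutation-invariant summary
            return self.head(pooled)         # predicted ratio of real samples

    def make_mixed_batch(real, fake):
        # Shuffle real and fake samples together; the target is the true ratio.
        batch = torch.cat([real, fake], dim=0)
        perm = torch.randperm(batch.size(0))
        ratio = real.size(0) / batch.size(0)
        return batch[perm], torch.tensor([ratio])

    # Usage: regress the discriminator output toward the ratio of real samples.
    disc = BatchDiscriminator()
    real, fake = torch.randn(6, 784), torch.randn(10, 784)
    mixed, target = make_mixed_batch(real, fake)
    loss = nn.functional.binary_cross_entropy(disc(mixed), target)
    loss.backward()

In the paper, the permutation-invariant architecture interleaves per-sample processing with batch-level statistics and is shown to be a universal approximator of symmetric functions; the plain mean pooling above only stands in for that richer design.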
Main file
mixed_batches_sym_disc_for_gans_with_ack (1).pdf (4.18 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01791126 , version 1 (19-06-2018)
hal-01791126 , version 2 (05-07-2018)

Identifiers

  • HAL Id: hal-01791126, version 2

Cite

Thomas Lucas, Corentin Tallec, Jakob Verbeek, Yann Ollivier. Mixed batches and symmetric discriminators for GAN training. ICML - 35th International Conference on Machine Learning, Jul 2018, Stockholm, Sweden. pp.2844-2853. ⟨hal-01791126v2⟩
