
Entropic GANs meet VAEs: A Statistical Approach to Compute Sample Likelihoods in GANs



  1. Entropic GANs meet VAEs: A Statistical Approach to Compute Sample Likelihoods in GANs Yogesh Balaji with Hamed Hassani, Rama Chellappa and Soheil Feizi

  2. Generative Adversarial Networks
     GANs are very successful at generating samples from a data distribution, e.g. StyleGAN (Karras et al., 2018) and BigGAN (Brock et al., 2018).

  3. Generative models
     Classical Approach: fitting an explicit model using maximum likelihood
     Modern Approach: Generative Adversarial Networks (GANs), which lack an explicit density model

  5. Generative models
     Classical Approach: fitting an explicit model using maximum likelihood
     Modern Approach: Generative Adversarial Networks (GANs), which lack an explicit density model
     Our contribution: connecting these two views of generative modeling

  6. Entropic GANs
     Entropic GANs are Wasserstein GANs with entropy regularization.
     Notation: Y is the real data random variable, X is the noise random variable, G : ℝ^r → ℝ^d is the generator function, Ŷ := G(X), and l(·, ·) is the loss function between two samples.
     Primal: min over G and the coupling P_{Y, G(X)} of E[l(Y, G(X))] − λ H(P_{Y, G(X)})
     Dual: min_G max_{D1, D2} E[D1(Y)] − E[D2(G(X))] − λ E_{P_Y × P_Ŷ}[exp(v(y, ŷ)/λ)],
     where v(y, ŷ) := D1(y) − D2(ŷ) − l(y, ŷ).
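To make the dual objective concrete, here is a minimal NumPy sketch of a Monte-Carlo estimate from finite batches; the vectorized potentials, the default squared-Euclidean loss, and the batch average over the product measure are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def entropic_dual_estimate(y_real, y_fake, D1, D2, lam, loss=None):
    """Monte-Carlo sketch of the Entropic GAN dual objective:
    E[D1(Y)] - E[D2(G(X))] - lam * E_{P_Y x P_Yhat}[exp(v(y, yhat) / lam)],
    where v(y, yhat) = D1(y) - D2(yhat) - l(y, yhat).

    y_real: (n, d) batch of real samples Y.
    y_fake: (m, d) batch of generated samples Yhat = G(X).
    D1, D2: dual potentials mapping a (k, d) batch to a (k,) vector.
    loss:   pairwise loss matrix l; squared Euclidean distance is assumed
            by default (an illustrative choice, not prescribed by the slide).
    """
    if loss is None:
        loss = lambda A, B: ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)

    d1 = D1(y_real)                                        # shape (n,)
    d2 = D2(y_fake)                                        # shape (m,)
    v = d1[:, None] - d2[None, :] - loss(y_real, y_fake)   # shape (n, m)
    return d1.mean() - d2.mean() - lam * np.exp(v / lam).mean()
```

In practice D1 and D2 would be small neural networks evaluated on the batch; any vectorized callables work in this sketch.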

  7. An explicit data model for Entropic GANs
     We construct an explicit probability model for the data distribution using GANs:
     f_{Y|X=x}(y) = C exp( −l(y, G(x)) / λ )
     where C is a normalization constant and l is the loss function used in the Entropic GAN.
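As a small illustration of this explicit model, the conditional log-density of a sample under a trained generator can be evaluated directly from the loss; this sketch assumes a squared-Euclidean loss and treats the log normalization constant as a supplied (or ignored) additive term.

```python
import numpy as np

def conditional_log_density(y, x, G, lam, log_C=0.0):
    """Sketch of log f_{Y|X=x}(y) = log C - l(y, G(x)) / lam.

    G:     trained generator mapping a latent x to a sample.
    lam:   entropy-regularization weight lambda.
    log_C: log of the normalization constant C; it is shared across y,
           so it can be dropped when only comparing samples.
    """
    l = ((np.asarray(y) - np.asarray(G(x))) ** 2).sum()  # assumed loss l
    return log_C - l / lam
```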

  8. Main theorem
     E_{P_Y}[log f_Y(Y)]  ≥  −(1/λ) { E_{P_{Y,Ŷ}}[l(Y, Ŷ)] − λ H(P_{Y,Ŷ}) } + constants
     The left-hand side is the average log-likelihood; the term in braces is the Entropic GAN objective.
     The Entropic GAN objective is therefore a variational lower bound on the log-likelihood, similar to the evidence lower bound in Variational Auto-Encoders.

  9. Main theorem
     E_{P_Y}[log f_Y(Y)]  ≥  −(1/λ) { E_{P_{Y,Ŷ}}[l(Y, Ŷ)] − λ H(P_{Y,Ŷ}) } + constants
     Average log-likelihood on the left, Entropic GAN objective in the braces: Entropic GANs meet VAEs.
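A minimal sketch of how the bracketed Entropic GAN objective, and hence the right-hand side of the bound, could be estimated from coupled samples (Y, Ŷ); the coupling entropy is passed in as a precomputed number because estimating it depends on the model, so this is illustrative rather than the authors' procedure.

```python
import numpy as np

def surrogate_loglik_lower_bound(y, y_hat, lam, coupling_entropy, loss=None):
    """Sketch of the bound -(1/lam) * ( E[l(Y, Yhat)] - lam * H(P_{Y,Yhat}) ),
    i.e. a lower bound on E[log f_Y(Y)] up to additive constants.

    y, y_hat:         (n, d) arrays of coupled samples from P_{Y, Yhat}.
    coupling_entropy: external estimate of H(P_{Y, Yhat}).
    loss:             per-pair loss; squared Euclidean distance assumed here.
    """
    if loss is None:
        loss = lambda a, b: ((a - b) ** 2).sum(-1)
    expected_loss = loss(np.asarray(y), np.asarray(y_hat)).mean()
    return -(expected_loss - lam * coupling_entropy) / lam
```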

  10. Is this bound useful?
      • Provides a statistical interpretation of the Entropic GAN objective function
      • Enables computing sample likelihoods for test samples

  11. Components of our surrogate likelihood

  12. Likelihood computation
      Given a dataset of MNIST-1 digits as the source distribution, we estimate the likelihood of MNIST and SVHN test samples under this distribution. Dissimilar datasets have low likelihood.
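In practice this comparison could be run by scoring every test sample with a per-sample surrogate log-likelihood and comparing the dataset means; the helper below and the score_fn it expects are hypothetical names, not part of the released code.

```python
import numpy as np

def rank_datasets_by_likelihood(score_fn, test_sets):
    """Rank test sets by mean surrogate log-likelihood (sketch).

    score_fn:  hypothetical callable returning the surrogate log-likelihood
               of one sample under the Entropic GAN trained on the source data.
    test_sets: dict mapping a name (e.g. 'MNIST', 'SVHN') to its samples.
    A dissimilar dataset (SVHN under an MNIST-1 model) should rank lower.
    """
    means = {name: float(np.mean([score_fn(y) for y in samples]))
             for name, samples in test_sets.items()}
    return sorted(means.items(), key=lambda kv: kv[1], reverse=True)
```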

  13. Tightness of the lower bound
      We use a Gaussian input data distribution to measure the tightness of our variational lower bound.

      Data dimension | Exact log-likelihood | Surrogate log-likelihood
      5              | -16.38               | -17.94
      10             | -35.15               | -43.60
      20             | -58.04               | -66.58
      30             | -91.80               | -100.69
      64             | -203.46              | -217.52
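For this Gaussian sanity check, the exact column can be reproduced in closed form; below is a sketch using SciPy's multivariate normal, with a standard Gaussian used purely as a stand-in since the slide does not specify the fitted mean and covariance.

```python
import numpy as np
from scipy.stats import multivariate_normal

def exact_gaussian_avg_loglik(samples, mean, cov):
    """Average exact log-likelihood of samples under N(mean, cov); this is the
    reference value the surrogate lower bound is compared against."""
    return float(multivariate_normal(mean=mean, cov=cov).logpdf(samples).mean())

# illustrative only: a 5-dimensional standard Gaussian (not the paper's setup)
rng = np.random.default_rng(0)
z = rng.standard_normal((10_000, 5))
print(exact_gaussian_avg_loglik(z, mean=np.zeros(5), cov=np.eye(5)))
```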

  14. Conclusion
      • Establish a connection between GANs and VAEs by deriving a variational lower bound for GANs
      • Provide a principled framework for computing sample likelihoods using GANs
      Please stop by Poster #17.
      Code available at https://github.com/yogeshbalaji/EntropicGANs_meet_VAEs
