Bias and Generalization in Deep Generative Models



  1. Bias and Generalization in Deep Generative Models
     Shengjia Zhao*, Hongyu Ren*, Arianna Yuan, Jiaming Song, Noah Goodman, and Stefano Ermon (*equal contribution)

  2. Success in Generative Modeling of Images
     Brock, A., et al. "Large Scale GAN Training for High Fidelity Natural Image Synthesis."

  3. Goal: Understanding Generalization How do generative models generalize?

  4. Generalization Example: Object Count

  5. Empirical Study of Generalization: Method
     • Design datasets
     • Train generative models (VAE, GAN, PixelCNN)
     • Observe generalization behavior
     • Find common patterns
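The dataset-design step above can be sketched as a toy generator of images that contain an exact number of objects. This is an illustrative sketch only: the canvas size, square objects, and the helper names `make_count_image` / `make_dataset` are assumptions, not the paper's actual data pipeline.

```python
import numpy as np

def make_count_image(n_objects, size=28, obj=3, rng=None, max_tries=1000):
    """Render a binary image containing exactly n_objects disjoint obj x obj squares.

    Toy stand-in for a controlled numerosity dataset; all parameters are
    illustrative choices, not taken from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    img = np.zeros((size, size), dtype=np.float32)
    placed = tries = 0
    while placed < n_objects:
        tries += 1
        assert tries <= max_tries, "could not place all objects"
        r = int(rng.integers(0, size - obj + 1))
        c = int(rng.integers(0, size - obj + 1))
        if img[r:r + obj, c:c + obj].sum() == 0:  # keep squares disjoint so the count is unambiguous
            img[r:r + obj, c:c + obj] = 1.0
            placed += 1
    return img

def make_dataset(counts, per_count, seed=0):
    """Stack per_count images for each numerosity in counts (e.g. [2] or [2, 7])."""
    rng = np.random.default_rng(seed)
    imgs = [make_count_image(k, rng=rng) for k in counts for _ in range(per_count)]
    labels = [k for k in counts for _ in range(per_count)]
    return np.stack(imgs), np.array(labels)
```

With such a generator, the training distribution over the feature of interest (here, object count) is known exactly, so any spread in the model's generated counts can be attributed to the model rather than to the data.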

  6. Generalization Example: Object Count

  7. Generalization in Feature Space: Numerosity
     Training on images that all contain exactly 2 objects generates a log-normal-shaped distribution over object counts.
     [Figure: frequency vs. # objects — training distribution (all mass at 2) and observed generated distribution spread over counts 1–4.]
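One way to check the "log-normal shaped" observation is to moment-match a discretized log-normal to the counts measured in generated samples. This is a sketch under assumptions: the helpers `fit_lognormal` / `lognormal_pmf` and the example counts are illustrative, not the paper's code or data.

```python
import numpy as np

def fit_lognormal(counts):
    """Moment-match a log-normal in log space: mean and std of log(count)."""
    logs = np.log(np.asarray(counts, dtype=float))
    return logs.mean(), logs.std()

def lognormal_pmf(ks, mu, sigma):
    """Log-normal density evaluated at integer counts, renormalized to a pmf."""
    ks = np.asarray(ks, dtype=float)
    dens = np.exp(-(np.log(ks) - mu) ** 2 / (2 * sigma ** 2)) / ks
    return dens / dens.sum()

# Illustrative counts measured from generated samples (placeholder values):
counts = [1, 2, 2, 2, 2, 3, 3, 4]
mu, sigma = fit_lognormal(counts)
pmf = lognormal_pmf(np.arange(1, 7), mu, sigma)
```

Comparing this fitted pmf against the empirical histogram of generated counts (e.g. by a goodness-of-fit test) would quantify how closely the model's behavior matches the log-normal shape.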

  8. Multiple Numerosities
     [Figure: training distribution with mass at counts 2 and 7; generated distribution unknown (?).]

  9. Multiple Numerosities: Only 2
     [Figure: training distribution (counts 2 and 7); generated distribution over counts 1–4.]

  10. Multiple Numerosities: Only 7
      [Figure: training distribution (counts 2 and 7); generated distribution over counts 6–9.]

  11. Multiple Numerosities: Additive Hypothesis
      [Figure: training distribution (counts 2 and 7); observed generated distribution over counts 1–4 and 6–9.]

  12. Additive Hypothesis with 2 and 4 Objects
      [Figure: training distribution (counts 2 and 4); hypothesized generated distribution over counts 1–6.]

  13. Actual Result: Prototype Enhancement
      3 objects is the most likely count, even though no training image contains 3 objects!
      [Figure: training distribution (counts 2 and 4); observed generated distribution over counts 1–6, peaked at 3.]
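The additive hypothesis can be made concrete by forming the equal mixture of the two single-numerosity generated distributions. The numbers below are illustrative placeholders, not measurements from the paper; the point is the shape prediction, which the observed prototype-enhancement result contradicts.

```python
import numpy as np

ks = np.arange(1, 7)  # object counts 1..6

# Illustrative generated count distributions after training on a single
# numerosity (placeholder values, not the paper's measurements):
p_train2 = np.array([0.10, 0.50, 0.25, 0.10, 0.04, 0.01])  # trained on 2 only
p_train4 = np.array([0.01, 0.05, 0.20, 0.50, 0.18, 0.06])  # trained on 4 only

# Additive hypothesis: training on {2, 4} should give the equal mixture,
# which stays bimodal with a dip at the intermediate count 3.
p_additive = 0.5 * (p_train2 + p_train4)

# Prototype enhancement is the observed departure from this prediction:
# the actual generated distribution peaks at 3, a count never seen in training.
```

So the hypothesis and the observation disagree exactly at the intermediate count: the mixture predicts a local dip at 3, while the trained models concentrate mass there.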

  14. Prototype Enhancement
      A similar pattern holds for other features: color, size, location.
      [Figure: training distribution (counts 2 and 4); observed generated distribution over counts 1–6, peaked at 3.]

  15. Multiple Features

  16. Memorization vs. Generalization

  17. Memorization vs. Generalization

  18. Different Setups, Similar Results
      - Different features (shape, color, size, numerosity, etc.)
      - Different models (VAE, GAN, PixelCNN, etc.)
      - Different architectures (fully connected, convolutional, etc.)
      - Different hyperparameters (network size, learning rate, etc.)

  19. Conclusion
      • New methodology: design datasets to probe generative models
      • Observed common patterns across different setups
      Welcome to our poster session for further discussions! Tuesday 5–7pm @ Room 210 & 230 AB, #6
      Code available at github.com/ermongroup/BiasAndGeneralization
