
Generative Adversarial Networks (GANs) - Prof. Seungchul Lee



  1. Generative Adversarial Networks (GANs) Prof. Seungchul Lee Industrial AI Lab.

  2. Source • Mastering GANs (Generative Adversarial Networks) in One Hour – by 최윤제 (Yunjey Choi) – YouTube: https://www.youtube.com/watch?v=odpjk7_tGY0 – Slides: https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network • CSC321 Lecture 19: GANs – by Prof. Roger Grosse, Univ. of Toronto – http://www.cs.toronto.edu/~rgrosse/courses/csc321_2018/ • CS231n: CNNs for Visual Recognition, Lecture 13: Generative Models – by Prof. Fei-Fei Li, Stanford University – http://cs231n.stanford.edu/

  3. Supervised Learning • Discriminative model https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  4. Unsupervised Learning • Generative model • Latent space https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  5. Model Distribution vs. Data Distribution

  6. Probability Distribution https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  7. Probability Distribution https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  8. Probability Distribution https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  9. Probability Density Estimation Problem • If $p_{\text{model}}(x)$ can be estimated to be close to $p_{\text{data}}(x)$, then new data can be generated by sampling from $p_{\text{model}}(x)$ https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network
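To make this idea concrete, here is a minimal 1-D sketch (the Gaussian model and all names are assumptions for illustration, not from the slides): estimate $p_{\text{model}}$ from data, then sample from it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for p_data: 1,000 samples from an "unknown" 1-D distribution
x_data = rng.normal(loc=2.0, scale=0.5, size=1000)

# Estimate p_model as a Gaussian by fitting its mean and standard deviation
mu_hat, sigma_hat = x_data.mean(), x_data.std()

# If p_model is close to p_data, new data is generated by sampling from p_model
x_generated = rng.normal(loc=mu_hat, scale=sigma_hat, size=10)
print(x_generated)
```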

  10. Generative Models from Lower Dimension • Learn the transformation via a neural network • Start by sampling the code vector $z$ from a fixed, simple distribution (e.g., a uniform or Gaussian distribution) • This code vector is then passed as input to a deterministic generator network $G$, which produces an output sample $x = G(z)$ Source: Prof. Roger Grosse at U of Toronto
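A minimal sketch of this recipe (the layer sizes and the 1-D output are assumptions for illustration):

```python
import numpy as np
import tensorflow as tf

latent_dim = 2  # dimensionality of the code vector z (assumed for this sketch)

# Deterministic generator network G: maps a code vector z to a sample x = G(z)
G = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # a 1-D output sample
])

# Sample the code vector z from a fixed, simple distribution (standard Gaussian here)
z = np.random.randn(5, latent_dim).astype("float32")
x = G(z)  # x = G(z); untrained, so its distribution matches no target yet
```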

  11. Deterministic Transformation (by Network) • 1-dimensional example • Remember: the network does not generate a distribution; rather, it maps a known distribution onto the target distribution Source: Prof. Roger Grosse at U of Toronto
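A classic 1-dimensional instance of such a mapping (an inverse-CDF transform, included as a generic illustration rather than the slides' exact example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Known, simple source distribution: z ~ Uniform(0, 1)
z = rng.uniform(size=10000)

# Deterministic map: the inverse CDF of an Exponential(1) distribution.
# Applying it to uniform noise yields exponentially distributed samples,
# i.e., the map transports the known distribution onto the target one.
x = -np.log(1.0 - z)

print(x.mean())  # ~1.0, the mean of Exponential(1)
```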

  12. Deterministic Transformation (by Network) • High-dimensional example Source: Prof. Roger Grosse at U of Toronto

  13. Prob. Density Function by Deep Learning • Generative model of images Source: Prof. Roger Grosse at U of Toronto

  14. Generative Adversarial Networks (GANs) • In generative modeling, we would like to train a network that models a distribution, such as a distribution over images. • GANs do not work with any explicit density function! – Instead, they take a game-theoretic approach

  15. Turing Test • One way to judge the quality of a generative model is to sample from it. • GANs are based on a very different idea: – train the model to produce samples that are indistinguishable from the real data, as judged by a discriminator network whose job is to tell real from fake

  16. Generative Adversarial Networks (GAN) • The idea behind Generative Adversarial Networks (GANs) is to train two different networks: – Generator network: tries to produce realistic-looking samples – Discriminator network: tries to distinguish between real and fake data • The generator network tries to fool the discriminator network

  17. Autoencoder • Dimension reduction • Recover the input data – Learns an encoding of the inputs so as to recover the original input from the encoding as well as possible
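A minimal Keras autoencoder sketch, assuming flattened 28x28 (784-dimensional) inputs and a 32-dimensional code:

```python
import tensorflow as tf

# Encoder: compress a 784-dim input (e.g., a flattened 28x28 image) to a code
inputs = tf.keras.Input(shape=(784,))
code = tf.keras.layers.Dense(32, activation="relu")(inputs)   # dimension reduction

# Decoder: reconstruct the original input from the code
recon = tf.keras.layers.Dense(784, activation="sigmoid")(code)

autoencoder = tf.keras.Model(inputs, recon)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
# autoencoder.fit(x_train, x_train, ...)  # trained to reproduce its own input
```

Unlike a GAN, the autoencoder is trained against a reconstruction loss on its own inputs rather than against an adversary.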

  18. Generative Adversarial Networks (GAN) • Analogous to the Turing test [Figure: generator network producing generated samples alongside real data]

  19. Generative Adversarial Networks (GAN) • Analogous to the Turing test [Figure: discriminator receives real and generated samples and must label each real or fake]

  20. Intuition for GAN https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  21. Discriminator Perspective (1/2) https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  22. Discriminator Perspective (2/2) https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  23. Generator Perspective https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  24. Loss Function of Discriminator https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  25. Loss Function of Generator https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network
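Slides 24-25 refer to the standard objective of Goodfellow et al. (2014), in which the two losses are the two sides of a single minimax value function (notation written out here, not copied from the slides):

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\!\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z(z)}\!\left[\log\!\left(1 - D(G(z))\right)\right]
```

The discriminator ascends $V$ (binary cross-entropy: real samples toward 1, fakes toward 0), while the generator descends it.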

  26. Non-Saturating Game https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  27. Non-Saturating Game https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network
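The non-saturating trick: early in training the discriminator easily rejects fakes, so $\log(1 - D(G(z)))$ is nearly flat and gives the generator almost no gradient. The standard fix (also from Goodfellow et al., 2014) flips the generator's objective:

```latex
\max_G \; \mathbb{E}_{z \sim p_z(z)}\!\left[\log D(G(z))\right]
\qquad \text{instead of} \qquad
\min_G \; \mathbb{E}_{z \sim p_z(z)}\!\left[\log\!\left(1 - D(G(z))\right)\right]
```

Both objectives have the same fixed points, but the non-saturating version provides strong gradients precisely when the generator is losing.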

  28. Solving a Minimax Problem https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network
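In practice the minimax problem is solved by alternating stochastic gradient updates on minibatches of size $m$ (as in Algorithm 1 of Goodfellow et al., 2014; the notation here is assumed, not the slides'): an ascent step on the discriminator parameters, then a descent step on the generator parameters.

```latex
\theta_D \leftarrow \theta_D + \eta \, \nabla_{\theta_D}
  \frac{1}{m} \sum_{i=1}^{m} \left[ \log D\!\left(x^{(i)}\right)
  + \log\!\left(1 - D\!\left(G\!\left(z^{(i)}\right)\right)\right) \right]

\theta_G \leftarrow \theta_G - \eta \, \nabla_{\theta_G}
  \frac{1}{m} \sum_{i=1}^{m} \log\!\left(1 - D\!\left(G\!\left(z^{(i)}\right)\right)\right)
```

(With the non-saturating game, the generator instead ascends the gradient of $\frac{1}{m}\sum_i \log D(G(z^{(i)}))$.)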

  29. GAN Implementation in TensorFlow

  30. TensorFlow Implementation [Figure: layer dimensions – generator 100 → 256 → 784, discriminator 784 → 256 → 1] https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  31. TensorFlow Implementation https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  32. TensorFlow Implementation https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  33. TensorFlow Implementation https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  34. TensorFlow Implementation https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  35. TensorFlow Implementation https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network

  36. TensorFlow Implementation https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network
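The dimensions on slides 30-36 (generator 100 → 256 → 784, discriminator 784 → 256 → 1) suggest a fully connected MNIST GAN. A minimal tf.keras sketch under that assumption follows; it is not the slides' exact code (which uses the lower-level TensorFlow API), and names such as train_step are chosen here for illustration.

```python
import tensorflow as tf
from tensorflow.keras import layers

latent_dim = 100  # noise dimension, per the slides' 100 -> 256 -> 784 generator

# Generator: 100-dim noise -> 256 hidden units -> 784 pixel outputs
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(784, activation="sigmoid"),  # flattened 28x28 image in [0, 1]
])

# Discriminator: 784 pixels -> 256 hidden units -> 1 real/fake score
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

bce = tf.keras.losses.BinaryCrossentropy()
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(x_real):
    batch = tf.shape(x_real)[0]
    # Discriminator step: push real toward 1, fake toward 0
    z = tf.random.normal([batch, latent_dim])
    with tf.GradientTape() as tape:
        x_fake = generator(z, training=True)
        d_real = discriminator(x_real, training=True)
        d_fake = discriminator(x_fake, training=True)
        d_loss = bce(tf.ones_like(d_real), d_real) + bce(tf.zeros_like(d_fake), d_fake)
    d_grads = tape.gradient(d_loss, discriminator.trainable_variables)
    d_opt.apply_gradients(zip(d_grads, discriminator.trainable_variables))
    # Generator step (non-saturating): make the discriminator call fakes real
    z = tf.random.normal([batch, latent_dim])
    with tf.GradientTape() as tape:
        d_fake = discriminator(generator(z, training=True), training=True)
        g_loss = bce(tf.ones_like(d_fake), d_fake)
    g_grads = tape.gradient(g_loss, generator.trainable_variables)
    g_opt.apply_gradients(zip(g_grads, generator.trainable_variables))
    return d_loss, g_loss
```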

  37. After Training • After training, use the generator network to generate new data https://www.slideshare.net/NaverEngineering/1-gangenerative-adversarial-network
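Continuing the sketch above: sampling new data after training is a single forward pass of the generator on fresh noise; the discriminator is no longer needed.

```python
z = tf.random.normal([16, latent_dim])      # fresh noise vectors
images = generator(z, training=False)       # 16 generated 784-dim samples
images = tf.reshape(images, [-1, 28, 28])   # reshape back to 28x28 images
```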

  38. GAN Samples

  39. Conditional GAN

  40. Conditional GAN • In an unconditioned generative model, there is no control over the modes of the data being generated. • In the Conditional GAN (CGAN), the generator learns to generate a fake sample with a specific condition or characteristic (such as a label associated with an image or a more detailed tag), rather than a generic sample from an unknown noise distribution.
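One common way to implement this conditioning (an assumption for illustration; CGAN implementations vary) is to concatenate a one-hot label with the generator's noise input and with the discriminator's image input:

```python
import tensorflow as tf
from tensorflow.keras import layers

latent_dim, num_classes = 100, 10

# Conditional generator: [noise z ; one-hot label y] -> 784-dim image
z_in = tf.keras.Input(shape=(latent_dim,))
y_in = tf.keras.Input(shape=(num_classes,))
h = layers.Concatenate()([z_in, y_in])
h = layers.Dense(256, activation="relu")(h)
x_out = layers.Dense(784, activation="sigmoid")(h)
cond_generator = tf.keras.Model([z_in, y_in], x_out)

# Conditional discriminator: [image x ; one-hot label y] -> real/fake score
x_in = tf.keras.Input(shape=(784,))
yd_in = tf.keras.Input(shape=(num_classes,))
hd = layers.Concatenate()([x_in, yd_in])
hd = layers.Dense(256, activation="relu")(hd)
d_out = layers.Dense(1, activation="sigmoid")(hd)
cond_discriminator = tf.keras.Model([x_in, yd_in], d_out)
```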

  41. Conditional GAN • MNIST digits generated conditioned on their class label

  42. Conditional GAN • A simple modification of the original GAN framework that conditions the model on additional information for better multi-modal learning • CGANs have many practical applications whenever explicit supervision is available

  43. Normal Distribution of MNIST • A standard normal distribution • This is how we would like points corresponding to MNIST digit images to be distributed in the latent space

  44. Generator at GAN [Figure: generator mapping latent-space points to images]

  45. Generator at Conditional GAN • Feed in a random point from the latent space together with the desired digit. • Even if the same latent point is used for two different digits, the process works correctly, since the latent space only encodes features such as stroke width or angle
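Continuing the conditional-generator sketch above, the same latent point can be paired with different labels to request different digits:

```python
z = tf.random.normal([1, latent_dim])             # one fixed latent point
for digit in (3, 7):
    y = tf.one_hot([digit], depth=num_classes)    # desired class as a one-hot label
    img = cond_generator([z, y], training=False)  # same z, different digit out
```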

  46. CGAN Implementation

  47. CGAN Implementation

  48. CGAN Implementation

  49. CGAN Implementation • Generate fake MNIST images with the CGAN
