  1. High-Fidelity Image Generation With Fewer Labels. Michael Tschannen*, Mario Lucic*, Marvin Ritter*, Xiaohua Zhai, Olivier Bachem, Sylvain Gelly (*equal contribution)

  2. Generative Adversarial Networks (GANs): Recent Progress. BigGAN (Brock, Donahue, Simonyan 2019)

  3. Generative Adversarial Networks (GANs): Recent Progress. BigGAN (Brock, Donahue, Simonyan 2019): class-conditional

  4. Generative Adversarial Networks (GANs): Recent Progress. BigGAN (Brock, Donahue, Simonyan 2019): class-conditional. Conditioning reduces the diverse generation problem to a per-class problem.

  5. Generative Adversarial Networks (GANs): Recent Progress. BigGAN (Brock, Donahue, Simonyan 2019): class-conditional. SS-GAN (Chen et al. 2019): unsupervised. Conditioning reduces the diverse generation problem to a per-class problem.

  6. Generative Adversarial Networks (GANs): Recent Progress. BigGAN (Brock, Donahue, Simonyan 2019): class-conditional. SS-GAN (Chen et al. 2019): unsupervised. Conditioning reduces the diverse generation problem to a per-class problem. Unsupervised models are considerably less powerful.

  7. Generative Adversarial Networks (GANs): Recent Progress. BigGAN (Brock, Donahue, Simonyan 2019): class-conditional. SS-GAN (Chen et al. 2019): unsupervised. This work: How to close the gap between conditional and unsupervised GANs?

  8. Proposed methods: Overview
     ● Replace ground-truth labels with synthetic/inferred labels ➜ no changes to the GAN architecture required
     ● Infer labels for the real data using self-supervised and semi-supervised learning techniques
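To make the label-replacement idea concrete, here is a minimal NumPy sketch (not the authors' implementation; `kmeans_labels` and the toy features are illustrative) of inferring per-example labels by clustering a learned representation, so they can be fed to an otherwise unchanged class-conditional GAN:

```python
import numpy as np

def kmeans_labels(features, k, iters=20):
    """Cluster feature vectors and return one inferred label per point
    (plain Lloyd's k-means with farthest-point initialization so the
    sketch is deterministic)."""
    centers = [features[0]]
    for _ in range(k - 1):
        # Next center: the point farthest from all centers chosen so far.
        d = np.min([np.linalg.norm(features - c, axis=1) for c in centers], axis=0)
        centers.append(features[np.argmax(d)])
    centers = np.stack(centers)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels

# Toy stand-in for a learned representation F(x): two separated blobs.
rng = np.random.default_rng(0)
feats = np.concatenate([rng.normal(0, 0.1, (50, 8)),
                        rng.normal(5, 0.1, (50, 8))])
labels = kmeans_labels(feats, k=2)  # inferred labels replace ground truth
```

The inferred `labels` array has exactly the shape a class-conditional GAN expects for its conditioning input, which is why the architecture itself needs no changes.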

  9. Proposed methods: Pre-training
     1. Learn a semantic representation F of the data using self-supervision by rotation prediction (Gidaris et al. 2018)
     2. Apply clustering or semi-supervised learning on top of the representation F
     3. Train the GAN with the inferred labels
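Step 1 above builds a pretext task from unlabeled images. A minimal sketch of that data preparation (the `rotation_batch` helper is illustrative; in practice the rotated images are fed to a CNN whose penultimate layer serves as F):

```python
import numpy as np

def rotation_batch(images):
    """Build the rotation-prediction pretext task: each image is rotated by
    0/90/180/270 degrees and labeled with the rotation index (0-3) that a
    network is then trained to predict."""
    rotated, targets = [], []
    for img in images:
        for r in range(4):
            rotated.append(np.rot90(img, k=r))  # rotate in the H, W plane
            targets.append(r)
    return np.stack(rotated), np.array(targets)

# Toy batch of 2 "images" (H x W x C).
imgs = np.arange(2 * 4 * 4 * 3, dtype=np.float32).reshape(2, 4, 4, 3)
x, y = rotation_batch(imgs)
```

No labels are consumed here: the rotation targets are free supervision, which is what makes the learned representation F usable before any annotation exists.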

  10. Proposed methods: Co-training
     ● Semi-supervised classification head on the discriminator
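A hedged sketch of the co-training idea, assuming a toy linear discriminator (all weights and the `semi_supervised_loss` helper are illustrative, not the paper's architecture): the discriminator's shared features feed both an adversarial score and a classification head, and the classification loss is computed only on the few labeled examples.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical tiny discriminator: one shared layer, two heads.
rng = np.random.default_rng(0)
W_feat = rng.normal(size=(16, 8))   # shared feature weights
W_adv = rng.normal(size=(8, 1))     # real/fake score head
W_cls = rng.normal(size=(8, 5))     # class-logits head (5 toy classes)

def discriminator(x):
    h = np.maximum(x @ W_feat, 0.0)         # shared ReLU features
    return h @ W_adv, softmax(h @ W_cls)    # adversarial score, class probs

def semi_supervised_loss(probs, labels, labeled_mask):
    """Cross-entropy on the classification head, counted only for the
    subset of examples that actually carry a label."""
    ce = -np.log(probs[np.arange(len(labels)), labels] + 1e-9)
    return (ce * labeled_mask).sum() / max(labeled_mask.sum(), 1)

x = rng.normal(size=(6, 16))
labels = np.array([0, 1, 2, 3, 4, 0])
mask = np.array([1, 1, 0, 0, 0, 0])  # only part of the batch is labeled
score, probs = discriminator(x)
loss = semi_supervised_loss(probs, labels, mask)
```

The masking is the essential part: unlabeled examples still contribute to the adversarial score, so the classifier head rides along on the same features at no extra labeling cost.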

  11. Improving the pre- and co-training methods
     ● Rotation self-supervision during GAN training (Chen et al. 2019)
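The auxiliary rotation task adds a weighted term to the discriminator objective. A minimal sketch under stated assumptions (the placeholder logits, the `adv_loss` value, and the weight `alpha` are all illustrative, not values from the paper):

```python
import numpy as np

def cross_entropy(probs, targets):
    """Mean negative log-likelihood of the correct rotation class."""
    return -np.mean(np.log(probs[np.arange(len(targets)), targets] + 1e-9))

# Combined objective: L = L_adv + alpha * L_rot, where L_rot is the
# rotation-prediction loss computed on rotated copies of the images.
rot_probs = np.full((8, 4), 0.25)        # placeholder: uniform predictions
rot_targets = np.tile(np.arange(4), 2)   # rotation indices 0-3, twice
adv_loss = 0.7                           # placeholder adversarial loss value
alpha = 1.0                              # hypothetical auxiliary-loss weight
total_loss = adv_loss + alpha * cross_entropy(rot_probs, rot_targets)
```

With uniform predictions over 4 rotations the auxiliary term equals ln 4, the maximum-entropy baseline; training drives it down, regularizing the discriminator features shared with the GAN loss.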

  12. Results (baseline: BigGAN trained with 100% of the labels)
     ● Clustering (SS) is the unsupervised SOTA (FID 22.0)
     ● S²GAN (20% of labels) and S³GAN (10%) match BigGAN (100%)
     ● S³GAN (20%) outperforms BigGAN (100%) (SOTA)

  13. Samples: BigGAN (our implementation) vs. proposed. BigGAN (100%) vs. S³GAN (10%), 256×256 px

  14. Results: S³GAN (10%), 256×256 px

  15. Code, pretrained models, and Colabs: github.com/google/compare_gan. Check out our poster #13 tonight, 6:30–9:00 pm!
