Robustness of Conditional GANs to Noisy Labels


1. Robustness of Conditional GANs to Noisy Labels. Spotlight presentation, NeurIPS 2018. Kiran K. Thekumparampil¹, Ashish Khetan¹, Zinan Lin², Sewoong Oh¹. ¹University of Illinois at Urbana-Champaign, ²Carnegie Mellon University. Poster #5, Tue, Dec 4, 2018.

2–4. Conditional GAN (cGAN) is vital for achieving high quality. Input: labeled real samples (X, Y), e.g. an image with the label "Cat". Output: fake samples for a given label Y, generated from a latent code [Brock et al. 2018]. Visual quality: cGAN >> GAN [https://github.com/tensorflow/models/tree/master/research/gan].

5–8. Conditional GAN is sensitive to noise in labels. A cGAN trained with noisy labels produces samples that are biased, generating examples from the wrong classes, and of lower quality (red boxes). [Figure: MNIST samples for labels 0 through 9; rows show the noisy real data, samples from a standard cGAN, and samples from our RCGAN.]

9. Conditional GAN (cGAN). [Architecture diagram: real pairs (x_real, y_real) are drawn from P; the generator G maps a latent code z and a label y to a fake sample x, inducing the generated distribution Q; the discriminator D is trained on both with the adversarial loss. The diagram's example label is the digit "eight".] The cGAN objective is min_Q JS(P ‖ Q). [Bora et al. 2018, Miyato et al. 2018, Sukhbaatar et al. 2015]

10. Conditional GAN under noisy labeled data. [Architecture diagram: the real label y_real is passed through a noisy channel given by the confusion matrix C, producing the noisy label ỹ_real; the discriminator D only ever sees real pairs (x_real, ỹ_real) drawn from the noisy distribution P̃, while fake pairs (G(z, y), y) are drawn from Q as before.] Training a standard cGAN on such data therefore solves min_Q JS(P̃ ‖ Q), i.e. it matches the clean generated distribution to the noisy real one. [Bora et al. 2018, Miyato et al. 2018, Sukhbaatar et al. 2015]
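To make the noisy channel concrete, here is a minimal sketch of corrupting labels through a confusion matrix C whose entry C[i, j] is the probability that true label i is reported as label j. The symmetric label-flip model and the numbers below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the label-noise channel: each true label i is replaced by a
# noisy label drawn from the i-th row of the confusion matrix C. The symmetric
# noise model (flip with probability eps, uniformly over the other classes) is
# an illustrative assumption only.
import numpy as np

rng = np.random.default_rng(0)

def make_symmetric_confusion(num_classes, eps):
    C = np.full((num_classes, num_classes), eps / (num_classes - 1))
    np.fill_diagonal(C, 1 - eps)
    return C

def corrupt_labels(y, C):
    """Sample a noisy label for each entry of y from the corresponding row of C."""
    return np.array([rng.choice(len(C), p=C[label]) for label in y])

C = make_symmetric_confusion(num_classes=10, eps=0.3)
y_true = rng.integers(0, 10, size=5)
y_noisy = corrupt_labels(y_true, C)
print(y_true, y_noisy)
```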

11. Robust Conditional GAN (RCGAN) architecture. [Architecture diagram: as before, real pairs (x_real, ỹ_real) come from the noisy distribution P̃. The key change is on the generated side: the generator's label y is also passed through the same confusion channel C before reaching the projection discriminator, so the discriminator compares noisy real pairs against noisy fake pairs drawn from Q̃.] The RCGAN objective is min_Q JS(P̃ ‖ Q̃). [Bora et al. 2018, Miyato et al. 2018, Sukhbaatar et al. 2015] A minimal training-step sketch of this idea is given below.
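The sketch below illustrates the RCGAN idea from this slide (the known-C setting) in PyTorch: the generated label is passed through the same confusion matrix C before the discriminator sees it, so real and fake pairs are compared in the noisy-label space. The network architectures, the simple concatenation-based discriminator (the paper uses a projection discriminator), the loss, and all hyperparameters are placeholder assumptions, not the paper's implementation.

```python
# Minimal RCGAN-style training step (sketch). Key idea illustrated: the fake
# label y_fake is corrupted through the same confusion matrix C before the
# discriminator sees it. Architectures and hyperparameters are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes, z_dim, x_dim = 10, 64, 784  # e.g. flattened MNIST

G = nn.Sequential(nn.Linear(z_dim + num_classes, 256), nn.ReLU(),
                  nn.Linear(256, x_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(x_dim + num_classes, 256), nn.ReLU(),
                  nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

# Confusion matrix C: C[i, j] = P(noisy label = j | true label = i).
eps = 0.2
C = torch.full((num_classes, num_classes), eps / (num_classes - 1))
C.fill_diagonal_(1 - eps)

def corrupt(y):
    """Sample noisy labels y_tilde ~ C[y, .] for a batch of true labels y."""
    return torch.multinomial(C[y], 1).squeeze(1)

def rcgan_step(x_real, y_noisy_real):
    """One D update and one G update on a batch whose real labels are already noisy."""
    b = x_real.size(0)

    # Fake batch: sample clean labels, generate images, then pass the labels
    # through the same noisy channel C before showing them to D.
    y_fake = torch.randint(num_classes, (b,))
    z = torch.randn(b, z_dim)
    x_fake = G(torch.cat([z, F.one_hot(y_fake, num_classes).float()], dim=1))
    y_fake_noisy = corrupt(y_fake)

    # Discriminator update on (noisy real) vs (noisy fake) pairs.
    d_real = D(torch.cat([x_real, F.one_hot(y_noisy_real, num_classes).float()], dim=1))
    d_fake = D(torch.cat([x_fake.detach(), F.one_hot(y_fake_noisy, num_classes).float()], dim=1))
    loss_d = (F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real)) +
              F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake)))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator update (non-saturating loss).
    d_fake = D(torch.cat([x_fake, F.one_hot(y_fake_noisy, num_classes).float()], dim=1))
    loss_g = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_d.item(), loss_g.item()
```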

12–13. Minimizing noisy divergence minimizes true divergence. Let P̃ and Q̃ be the noisy-labeled versions of P and Q.

Theorem 1 (population-level analysis).
  TV(P̃, Q̃) ≤ TV(P, Q) ≤ M_C · TV(P̃, Q̃)
  JS(P̃ ‖ Q̃) ≤ JS(P ‖ Q) ≤ M_C · √(8 · JS(P̃ ‖ Q̃))
In particular, Q̃ = P̃ implies Q = P. Here TV is the total variation distance, JS the Jensen–Shannon divergence, and M_C ≜ max_i Σ_j |(C⁻¹)_{ij}|.

The neural network distance d_F, defined with respect to a class of parametric discriminator functions F, is known to generalize [Arora et al. 2017].
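As a quick illustration of the constant M_C in Theorem 1, the snippet below computes M_C = max_i Σ_j |(C⁻¹)_{ij}| for a symmetric label-flip confusion matrix at a few noise levels. The noise model and the levels are illustrative assumptions; the point is simply that M_C equals 1 with no noise and grows as the noise level increases.

```python
# Illustrative computation (not from the paper) of M_C = max_i sum_j |(C^{-1})_{ij}|,
# i.e. the infinity-norm of C^{-1}, for a symmetric label-flip confusion matrix.
import numpy as np

def m_c(C):
    """M_C = max over rows of the absolute row sums of C^{-1}."""
    return np.abs(np.linalg.inv(C)).sum(axis=1).max()

k = 10  # number of classes
for eps in [0.0, 0.2, 0.5, 0.7]:
    C = np.full((k, k), eps / (k - 1))
    np.fill_diagonal(C, 1 - eps)
    print(f"eps = {eps:.1f}  M_C = {m_c(C):.2f}")
```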

14. Minimizing noisy divergence minimizes true divergence. Let P̃_n and Q̃_n be the empirical noisy real and generated distributions.

Theorem 2 (finite-sample analysis). If the discriminator class F satisfies the inclusion condition, then there exists c > 0 such that
  d_F(P̃_n, Q̃_n) − ε ≤ d_F(P, Q) ≤ M_C · ( d_F(P̃_n, Q̃_n) + ε )
with probability at least 1 − e⁻ᵖ, for any ε > 0 and n ≥ c · p · log(pL/ε) / ε², when F is L-Lipschitz in its p parameters.

The projection discriminator satisfies the inclusion condition.

15–17. RCGAN generates correct class (MNIST). [Figure: generator label accuracy (0.0 to 1.0) vs. label noise level (0.0 to 0.9) on MNIST, with curves for RCGAN, RCGAN-U, and the standard cGAN; the RCGAN curve lies highest and the cGAN curve lowest.]

18. RCGAN improves quality of samples (CIFAR-10). [Figure: Inception Score (roughly 7.5 to 8.2) vs. label noise level (0.0 to 0.8) on CIFAR-10, with curves for RCGAN, RCGAN-U, and the standard cGAN.]

19. RCGAN can correct noisy training labels (MNIST). [Figure: label recovery accuracy (0.0 to 1.0) vs. label noise level (0.0 to 0.9) on MNIST, with curves for RCGAN, RCGAN-U, and the standard cGAN; RCGAN recovers labels most accurately.] A hedged sketch of one possible recovery procedure follows.
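The recovery procedure itself is not spelled out on this slide. One plausible way to re-label a noisy example with a trained conditional discriminator is to keep the label the discriminator scores highest for that image. This is a hypothetical illustration (it assumes the concatenation-style discriminator from the earlier sketch) and may not match the exact procedure used in the paper.

```python
# Hypothetical label-recovery sketch: score each candidate label with a trained
# discriminator D(x, y) and keep the highest-scoring one. Assumes D takes the
# concatenation [x, one-hot(y)] as in the earlier training sketch.
import torch
import torch.nn.functional as F

@torch.no_grad()
def recover_labels(D, x, num_classes=10):
    """Return argmax over y of D(x, y) for each example in the batch x."""
    scores = []
    for y in range(num_classes):
        y_batch = torch.full((x.size(0),), y, dtype=torch.long)
        y_onehot = F.one_hot(y_batch, num_classes).float()
        scores.append(D(torch.cat([x, y_onehot], dim=1)))  # shape (batch, 1)
    return torch.cat(scores, dim=1).argmax(dim=1)          # shape (batch,)
```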

20. Thank you. Poster #5, Tue, Dec 4. https://github.com/POLane16/Robust-Conditional-GAN

[Arora 2017] S. Arora, R. Ge, Y. Liang, T. Ma, and Y. Zhang. Generalization and equilibrium in generative adversarial nets (GANs). ICML, 2017.
[Bora 2018] A. Bora, E. Price, and A. G. Dimakis. AmbientGAN: Generative models from lossy measurements. ICLR, 2018.
[Brock 2018] A. Brock, J. Donahue, and K. Simonyan. Large scale GAN training for high fidelity natural image synthesis. arXiv preprint arXiv:1809.11096, 2018.
[Miyato 2018] T. Miyato and M. Koyama. cGANs with projection discriminator. ICLR, 2018.
[Sukhbaatar 2015] S. Sukhbaatar, J. Bruna, M. Paluri, L. Bourdev, and R. Fergus. Training convolutional networks with noisy labels. ICLR Workshop, 2015.
