Adversarially Regularized Autoencoders
Junbo (Jake) Zhao, Yoon Kim, Kelly Zhang, Alexander M. Rush, Yann LeCun
Presented by Wei Zhen Teoh and Mathieu Ravaut
Refresh: Adversarial Autoencoder
[From Adversarial Autoencoders by Makhzani et al 2015]
[From Wasserstein GAN by Arjovsky et al 2017]
[From https://ayearofai.com/lenny-2-autoencoders-and-word]
[From https://mlalgorithm.wordpress.com/2016/08/04/deep-learning-part-2-recurrent-neural-networks-rnn/]
The model combines the autoencoder's reconstruction loss with the Wasserstein distance between two distributions over the latent code: the encoder's and the generator's.
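Written out as a sketch (symbol names are mine: enc_φ / dec_ψ is the autoencoder, P_Q the distribution of encoded codes, P_z the generator's code distribution, λ a weighting coefficient), the combined objective is:

```latex
\min_{\phi,\psi}\;
\underbrace{\mathbb{E}_{x}\big[-\log p_{\psi}\!\big(x \mid \mathrm{enc}_{\phi}(x)\big)\big]}_{\text{reconstruction loss}}
\;+\;
\lambda\,\underbrace{W\big(\mathbb{P}_{Q},\,\mathbb{P}_{z}\big)}_{\text{Wasserstein distance}}
```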
The maximum of this function over 1-Lipschitz critics approximates the Wasserstein distance.
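Concretely, the Kantorovich–Rubinstein duality used in the WGAN paper expresses the Wasserstein (EM) distance as a supremum over 1-Lipschitz critics f:

```latex
W(\mathbb{P}_r, \mathbb{P}_g)
\;=\;
\sup_{\|f\|_L \le 1}\;
\mathbb{E}_{x \sim \mathbb{P}_r}\!\big[f(x)\big]
\;-\;
\mathbb{E}_{x \sim \mathbb{P}_g}\!\big[f(x)\big]
```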
Text transfer on a sentiment attribute: a classifier is trained to predict the attribute from the latent code, and the encoder is trained adversarially against this classifier so that the code becomes attribute-invariant.
Training alternates two phases. AE: minimize the reconstruction loss. WGAN: minimize the EM distance between the code distribution and the generator's distribution.
[From Adversarially Regularized Autoencoders by Zhao et al, 2017]
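The alternation can be sketched on a deliberately tiny 1-D toy. Everything here is illustrative, not the paper's architecture: the encoder, decoder, generator, and critic are single scalar weights, gradients are written out by hand, and weight clipping stands in for the critic's Lipschitz constraint as in the original WGAN.

```python
import numpy as np

rng = np.random.default_rng(0)
w_enc, w_dec, w_gen, w_crit = 0.5, 0.5, 0.5, 0.1  # toy scalar "networks"
lr, clip = 0.05, 1.0

for _ in range(500):
    x = rng.normal(0.0, 1.0, 64)  # real data batch
    z = rng.normal(0.0, 1.0, 64)  # noise batch for the generator

    # --- AE phase: minimize reconstruction loss ||dec(enc(x)) - x||^2 ---
    c = w_enc * x
    err = w_dec * c - x
    g_dec = np.mean(2 * err * c)          # d loss / d w_dec
    g_enc = np.mean(2 * err * w_dec * x)  # d loss / d w_enc
    w_dec -= lr * g_dec
    w_enc -= lr * g_enc

    # --- Critic phase: maximize E[f(c)] - E[f(c_gen)], with f(c) = w_crit * c ---
    c, c_gen = w_enc * x, w_gen * z
    w_crit += lr * (np.mean(c) - np.mean(c_gen))
    w_crit = float(np.clip(w_crit, -clip, clip))  # weight clipping ~ Lipschitz

    # --- Adversarial phase: encoder lowers the critic on real codes,
    #     generator raises it on generated codes (shrinking the EM estimate) ---
    w_enc -= lr * w_crit * np.mean(x)
    w_gen += lr * w_crit * np.mean(z)

recon_mse = float(np.mean((w_dec * w_enc * x - x) ** 2))
```

After training, the product `w_enc * w_dec` sits near 1 (good reconstruction) while the adversarial phases keep nudging the code distribution toward the generator's.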
[Partly from https://blog.statsbot.co/time-series-prediction-using-recurrent-neural-networks-lstms-807fa6ca7f]
For text transfer, the same AE / WGAN training on the EM distance is used, but with one decoder per class.
Text-transfer architecture:
- Encode all sentences (shared encoder)
- Decode positive sentences (positive-class decoder)
- Decode negative sentences (negative-class decoder)
[Partly from https://blog.statsbot.co/time-series-prediction-using-recurrent-neural-networks-lstms-807fa6ca7f]
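The shared-encoder / per-class-decoder split can be illustrated with an invented stand-in: here a "sentence" is a signed number whose magnitude plays the role of content and whose sign plays the role of sentiment. None of this is the trained model; it only shows the wiring.

```python
def encode(x: float) -> float:
    """Shared encoder: keep class-independent content, drop the attribute."""
    return abs(x)

def decode_positive(code: float) -> float:
    """Decoder for the positive class: reattach a positive attribute."""
    return +code

def decode_negative(code: float) -> float:
    """Decoder for the negative class: reattach a negative attribute."""
    return -code

def transfer(x: float, to_positive: bool) -> float:
    """Style transfer: encode with the shared encoder, then decode with
    the target class's decoder."""
    code = encode(x)
    return decode_positive(code) if to_positive else decode_negative(code)
```

Content (the magnitude) survives the transfer while the attribute (the sign) is set by the chosen decoder: `transfer(-2.0, to_positive=True)` gives `2.0`.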