Conditional Generative Adversarial Nets in TensorFlow
We have seen the Generative Adversarial Nets (GAN) model in the previous post. We have also seen the arch nemesis of GAN, the VAE, and its conditional variation: Conditional VAE (CVAE). Hence, it is only proper for us to study the conditional variation of GAN, called Conditional GAN or CGAN for short.
CGAN: Formulation and Architecture
Recall that in GAN we have two neural nets: the generator $G(z)$ and the discriminator $D(X)$. Now, as we want to condition those networks on some extra information $y$, for example a class label, the easiest way is to feed $y$ into both of them. Hence, our generator and discriminator become $G(z, y)$ and $D(X, y)$ respectively.
We can see it from a probabilistic point of view: $G(z, y)$ models the distribution of our data given $z$ and $y$, that is, generated samples are drawn as $x \sim G(x \mid z, y)$.
Likewise for the discriminator, it now tries to find the discriminating (real vs. fake) label for $X$ and the generated samples, modeled as $d \sim D(d \mid X, y)$.
Hence, we could see that both $D$ and $G$ are now conditioned on $y$.
Now, the objective function is given by:

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\text{data}}(x)} \left[ \log D(x \mid y) \right] + \mathbb{E}_{z \sim p_z(z)} \left[ \log \left( 1 - D(G(z \mid y)) \right) \right]$$

If we compare the above loss to the original GAN loss, the difference only lies in the additional conditioning variable $y$ in both $D$ and $G$.
The architecture of CGAN is now as follows (taken from [1]):
In contrast with the architecture of GAN, we now have an additional input layer, for $y$, in both the discriminator net and the generator net.
CGAN: Implementation in TensorFlow
Implementing CGAN is so simple that we just need to add a handful of lines to the original GAN implementation. So, here we will only look at those modifications.
The first addition for CGAN is a new input that holds the variable y we are conditioning our CGAN on.
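Below is a minimal sketch of what that input could look like, assuming the TensorFlow 1.x placeholder style of the original GAN code (with `import tensorflow as tf`) and MNIST labels encoded as one-hot vectors, so y_dim = 10:

```python
# Dimensionality of the conditional variable: one-hot MNIST labels
y_dim = 10

# New placeholder for the conditioning variable y,
# alongside the existing placeholders X (images) and Z (noise)
y = tf.placeholder(tf.float32, shape=[None, y_dim], name='y')
```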
Next, we add it to both our generator net and discriminator net. The question is how to incorporate the new variable y into G(z) and D(X). The simplest way is to just concatenate y with the inputs, so the generator sees [z, y] and the discriminator sees [X, y]. Of course, as the inputs for G and D now have a larger dimensionality, we have to account for that in the first layer of each network. That is, we just adjust the dimensionality of our weights.
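A sketch of how this might look, assuming the single-hidden-layer nets from the previous GAN post; the weight and bias variables (G_W1, G_b1, G_W2, G_b2, D_W1, D_b1, D_W2, D_b2) are assumed to be defined elsewhere, with the first-layer weights widened by y_dim:

```python
def generator(z, y):
    # Concatenate noise z and condition y along the feature axis,
    # so the first layer sees an input of size Z_dim + y_dim
    inputs = tf.concat([z, y], axis=1)
    G_h1 = tf.nn.relu(tf.matmul(inputs, G_W1) + G_b1)
    G_log_prob = tf.matmul(G_h1, G_W2) + G_b2
    return tf.nn.sigmoid(G_log_prob)


def discriminator(x, y):
    # Same trick: the discriminator's first layer now takes X_dim + y_dim inputs
    inputs = tf.concat([x, y], axis=1)
    D_h1 = tf.nn.relu(tf.matmul(inputs, D_W1) + D_b1)
    D_logit = tf.matmul(D_h1, D_W2) + D_b2
    return tf.nn.sigmoid(D_logit), D_logit
```

For instance, G_W1 would now have shape [Z_dim + y_dim, h_dim] rather than [Z_dim, h_dim], and D_W1 would have shape [X_dim + y_dim, h_dim].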
Next, we just use our new networks:
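For example, with the placeholders and functions sketched above:

```python
# Every call now also takes the conditioning variable y
G_sample = generator(Z, y)
D_real, D_logit_real = discriminator(X, y)
D_fake, D_logit_fake = discriminator(G_sample, y)
```

The losses and optimizers can stay as in the original GAN code.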
And finally, when training, we also feed the value of y into our networks:
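A sketch of that training step, assuming the MNIST loader from the TensorFlow examples with one_hot=True (so next_batch already returns one-hot labels), plus the solvers, losses, and sample_Z helper from the previous GAN post:

```python
for it in range(100000):
    # next_batch returns a minibatch of images and their one-hot labels
    X_mb, y_mb = mnist.train.next_batch(mb_size)
    Z_mb = sample_Z(mb_size, Z_dim)

    # Feed y to both the discriminator update and the generator update
    _, D_loss_curr = sess.run([D_solver, D_loss],
                              feed_dict={X: X_mb, y: y_mb, Z: Z_mb})
    _, G_loss_curr = sess.run([G_solver, G_loss],
                              feed_dict={y: y_mb, Z: Z_mb})
```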
In the example above, we are training our GAN with MNIST data, and the conditional variable y is the image label, encoded as a one-hot vector.
CGAN: Results
At test time, we want to generate new data samples with a certain label. For example, we set the label to be 5, i.e. we want to generate the digit “5”:
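One way to do that, sketched with the names used above (the index 5 assumes the standard one-hot encoding of MNIST labels):

```python
import numpy as np

n_sample = 16

# Sample z from the prior as usual
Z_sample = sample_Z(n_sample, Z_dim)

# Fix the conditional variable: one-hot vectors with a 1 at index 5
y_sample = np.zeros(shape=[n_sample, y_dim])
y_sample[:, 5] = 1.

samples = sess.run(G_sample, feed_dict={Z: Z_sample, y: y_sample})
```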
Above, we just sample z from the prior as usual, but fix our conditional variable y to be a one-hot vector with the value 1 at the index for digit 5.
Here are the results:
Looks pretty much like the digit 5, right?
If we instead set our one-hot vectors to have a value of 1 in the 7th index:
Those results confirm that we have successfully trained our CGAN.
Conclusion
In this post, we looked at the analogue of CVAE for GAN: the Conditional GAN (CGAN). We showed that to turn a GAN into a CGAN, we only need to make a few small modifications to our GAN implementation.
The conditional variable for CGAN, just like for CVAE, could be anything, which makes CGAN an interesting model to work with for data modeling.
The full code is available at my GitHub repo: https://github.com/wiseodd/generative-models.
References
- Mirza, Mehdi, and Simon Osindero. “Conditional generative adversarial nets.” arXiv preprint arXiv:1411.1784 (2014).