Tags: #neural networks
-   Conditional Generative Adversarial Nets in TensorFlow
    Having seen the GAN, VAE, and CVAE models, it is only proper to study the Conditional GAN model next!
-   Residual Net
    In this post, we will look into the record-breaking convnet model of 2015: the Residual Net (ResNet).
-   Generative Adversarial Nets in TensorFlow
    Let's try to implement Generative Adversarial Nets (GAN), first introduced by Goodfellow et al., 2014, with TensorFlow. We'll use MNIST ...
-   Deriving LSTM Gradient for Backpropagation
    Deriving neural net gradients is a great exercise for understanding backpropagation and computational graphs better. In this post we will ...
-   Convnet: Implementing Maxpool Layer with Numpy
    Another important building block in a convnet is the pooling layer. Nowadays, the most widely used is the max pool layer. ...
-   Convnet: Implementing Convolution Layer with Numpy
    Convnet is dominating the world of computer vision right now. What makes it special is, of course, the convolution layer, hence ...
-   Implementing BatchNorm in Neural Net
    BatchNorm is a relatively new technique for training neural nets. It gives us a lot of relaxation when initializing the ...
-   Implementing Dropout in Neural Net
    Dropout is one simple way to regularize a neural net model. This is one of the recent advancements in Deep ...
-   Beyond SGD: Gradient Descent with Momentum and Adaptive Learning Rate
    There are many attempts to improve Gradient Descent: some add momentum, some add an adaptive learning rate. Let's see what's out ...
-   Implementing Minibatch Gradient Descent for Neural Networks
    Let's use Python and Numpy to implement the Minibatch Gradient Descent algorithm for a simple 3-layer Neural Network.