Tags: #python
-
Deriving LSTM Gradient for Backpropagation
Deriving a neural net's gradient is an excellent exercise for better understanding backpropagation and computational graphs. In this post we will ...
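As a taste of the kind of derivation involved, here is a toy sketch of my own (not the post's full LSTM derivation): backpropagating through one LSTM-style output gate, h = sigmoid(x @ Wo) * tanh(c), with the chain-rule gradient verified numerically.

```python
import numpy as np

def sigmoid(z):
    return 1. / (1. + np.exp(-z))

np.random.seed(0)
x = np.random.randn(1, 4)   # input
c = np.random.randn(1, 3)   # cell state
Wo = np.random.randn(4, 3)  # output gate weights

def loss(W):
    o = sigmoid(x @ W)      # output gate
    h = o * np.tanh(c)      # hidden state
    return h.sum()          # dummy scalar loss

# Chain rule: dL/dWo = x.T @ (dL/dh * tanh(c) * o * (1 - o))
o = sigmoid(x @ Wo)
dWo = x.T @ (np.ones_like(c) * np.tanh(c) * o * (1 - o))

# Numerical gradient check via central differences
eps, num = 1e-6, np.zeros_like(Wo)
for i in range(Wo.shape[0]):
    for j in range(Wo.shape[1]):
        Wp, Wm = Wo.copy(), Wo.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        num[i, j] = (loss(Wp) - loss(Wm)) / (2 * eps)

print(np.allclose(dWo, num, atol=1e-5))  # True
```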
-
Convnet: Implementing Maxpool Layer with Numpy
Another important building block in a convnet is the pooling layer. Nowadays, the most widely used is the max pool layer. ...
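For a flavor of the operation, a minimal sketch of the forward pass over a single 2D feature map; batching, channels, and the backward pass are left out here.

```python
import numpy as np

def maxpool_forward(X, size=2, stride=2):
    """Max pool one feature map X of shape (H, W): take the max
    of each size x size window, moving by stride."""
    H, W = X.shape
    h_out = (H - size) // stride + 1
    w_out = (W - size) // stride + 1
    out = np.zeros((h_out, w_out))
    for i in range(h_out):
        for j in range(w_out):
            patch = X[i*stride:i*stride+size, j*stride:j*stride+size]
            out[i, j] = patch.max()
    return out

X = np.arange(16).reshape(4, 4)
print(maxpool_forward(X))
# [[ 5.  7.]
#  [13. 15.]]
```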
-
Convnet: Implementing Convolution Layer with Numpy
Convnets are dominating the world of computer vision right now. What makes them special is, of course, the convolution layer, hence ...
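As a glimpse of that core operation, a minimal sketch of a valid "convolution" (technically cross-correlation, which is what convnet layers compute) of one channel with one kernel; padding, stride, and multiple filters are omitted.

```python
import numpy as np

def conv2d(X, K):
    """Slide kernel K over input X and take the elementwise
    product-sum at each position (valid cross-correlation)."""
    H, W = X.shape
    h, w = K.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(X[i:i+h, j:j+w] * K)
    return out

X = np.random.randn(5, 5)
K = np.random.randn(3, 3)
print(conv2d(X, K).shape)  # (3, 3)
```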
-
Implementing BatchNorm in Neural Net
BatchNorm is a relatively new technique for training neural nets. It gives us a lot of relaxation when initializing the ...
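A minimal sketch of the training-time forward pass, assuming a minibatch X of shape (N, D) and learnable scale/shift parameters gamma and beta; inference additionally needs running averages of the mean and variance, which are omitted here.

```python
import numpy as np

def batchnorm_forward(X, gamma, beta, eps=1e-8):
    """Normalize each feature over the minibatch to zero mean and
    unit variance, then scale and shift with gamma and beta."""
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta

X = np.random.randn(64, 10) * 5 + 3
out = batchnorm_forward(X, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(axis=0).round(6))  # per-feature mean ~0
print(out.std(axis=0).round(6))   # per-feature std ~1
```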
-
Implementing Dropout in Neural Net
Dropout is one simple way to regularize a neural net model. This is one of the recent advancements in Deep ...
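A minimal sketch of the inverted-dropout variant, where activations are rescaled by 1/p_keep at training time so that nothing has to change at test time.

```python
import numpy as np

def dropout_forward(h, p_keep=0.8, train=True):
    """Zero each unit with probability 1 - p_keep and rescale the
    survivors by 1/p_keep, so the expected activation is unchanged."""
    if not train:
        return h
    mask = (np.random.rand(*h.shape) < p_keep) / p_keep
    return h * mask

h = np.random.randn(4, 5)
print(dropout_forward(h))
```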
-
Beyond SGD: Gradient Descent with Momentum and Adaptive Learning Rate
There are many attempts to improve Gradient Descent: some add momentum, some add an adaptive learning rate. Let's see what's out ...
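To make the two ideas concrete, a toy sketch of a momentum update and an RMSprop-style adaptive learning rate, run on f(x) = x²/2 with gradient f'(x) = x; the hyperparameters are illustrative.

```python
import numpy as np

grad = lambda x: x  # gradient of f(x) = 0.5 * x**2

# Momentum: accumulate a velocity and move along it
x, v, lr, gamma = 5.0, 0.0, 0.1, 0.9
for _ in range(100):
    v = gamma * v + lr * grad(x)
    x -= v
print(x)  # near the minimum at 0

# RMSprop: divide the step by a running average of squared gradients
x, cache, lr, decay, eps = 5.0, 0.0, 0.1, 0.9, 1e-8
for _ in range(100):
    g = grad(x)
    cache = decay * cache + (1 - decay) * g**2
    x -= lr * g / (np.sqrt(cache) + eps)
print(x)  # also near 0, oscillating within roughly lr of it
```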
-
Implementing Minibatch Gradient Descent for Neural Networks
Let's use Python and Numpy to implement the Minibatch Gradient Descent algorithm for a simple 3-layer Neural Network.
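The core of the algorithm is just iterating over shuffled slices of the training set. A minimal sketch of that loop; backward(), model, and alpha in the usage comment are hypothetical stand-ins, not the post's actual code.

```python
import numpy as np

def get_minibatches(X, y, batch_size):
    """Shuffle the data once per epoch, then yield successive
    minibatches of size batch_size."""
    idx = np.random.permutation(X.shape[0])
    X, y = X[idx], y[idx]
    for i in range(0, X.shape[0], batch_size):
        yield X[i:i + batch_size], y[i:i + batch_size]

# Usage inside a training loop (hypothetical names):
# for epoch in range(n_epochs):
#     for X_mb, y_mb in get_minibatches(X_train, y_train, 64):
#         grads = backward(model, X_mb, y_mb)
#         for k in model:
#             model[k] -= alpha * grads[k]
```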
-
Parallelizing Monte Carlo Simulation in Python
Monte Carlo simulation is all about quantity. It can take a long time to complete. Here's how to speed it ...
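One standard way to do it, shown here as a sketch on a toy pi-estimation problem (not necessarily the simulation the post parallelizes), is to split the samples across worker processes with multiprocessing.Pool.

```python
import numpy as np
from multiprocessing import Pool

def mc_pi(n):
    """One worker's share: sample n points in the unit square and
    count the fraction landing inside the quarter circle."""
    xy = np.random.rand(n, 2)
    return 4 * np.mean((xy ** 2).sum(axis=1) <= 1)

if __name__ == '__main__':
    # Split 8 million samples across 4 independent worker processes,
    # then average their estimates.
    with Pool(4) as pool:
        estimates = pool.map(mc_pi, [2_000_000] * 4)
    print(np.mean(estimates))  # ~3.14
```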
-
Scrapy as a Library in Long Running Process
Scrapy is a great web crawler framework, but it's tricky to make it run as a library in a long-running ...
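A minimal sketch of driving Scrapy from your own code with CrawlerRunner, along the lines of the Scrapy documentation; the spider below is a made-up example. The catch that makes long-running use tricky is that the Twisted reactor cannot be restarted once it has stopped.

```python
import scrapy
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
from twisted.internet import reactor

class QuotesSpider(scrapy.Spider):
    # Hypothetical minimal spider, for illustration only
    name = 'quotes'
    start_urls = ['http://quotes.toscrape.com']

    def parse(self, response):
        for q in response.css('span.text::text').getall():
            yield {'quote': q}

configure_logging()
runner = CrawlerRunner()
d = runner.crawl(QuotesSpider)        # returns a Deferred
d.addBoth(lambda _: reactor.stop())   # stop the reactor when done
reactor.run()
```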
-
Slice Sampling
An implementation example of Slice Sampling for a special case: a unimodal distribution with a known inverse PDF
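A sketch of that special case on a toy target of my choosing, Exponential(1): its density f(x) = exp(-x) is unimodal with the closed-form inverse f⁻¹(u) = -ln(u), so each slice {x : f(x) > u} is exactly the interval [0, -ln u) and no stepping-out search is needed.

```python
import numpy as np

f = lambda x: np.exp(-x)       # target density (Exponential(1))
f_inv = lambda u: -np.log(u)   # closed-form inverse of the density

def slice_sample(n_samples, x0=1.0):
    samples, x = [], x0
    for _ in range(n_samples):
        u = np.random.uniform(0, f(x))      # vertical: height under the curve
        x = np.random.uniform(0, f_inv(u))  # horizontal: uniform on the slice
        samples.append(x)
    return np.array(samples)

s = slice_sample(100_000)
print(s.mean())  # should be close to E[X] = 1 for Exponential(1)
```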