Simple Neural Network for MNIST Handwritten Digit Recognition
Please download the data and follow the instructions to get the training and test sets from http://deeplearning.net/tutorial/gettingstarted.html (one way to load the file is sketched below)
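A minimal loading sketch: the tutorial's `mnist.pkl.gz` contains three (images, labels) tuples pickled under Python 2, so Python 3 needs `encoding="latin1"`. The file name and path are assumptions about where you saved the download.

```python
import gzip
import pickle

# mnist.pkl.gz holds (train_set, valid_set, test_set); each is an
# (images, labels) pair where images has shape (n, 784), float32.
with gzip.open("mnist.pkl.gz", "rb") as f:
    train_set, valid_set, test_set = pickle.load(f, encoding="latin1")

train_x, train_y = train_set  # 50,000 training examples
test_x, test_y = test_set     # 10,000 test examples
```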
Two fully connected hidden layers with ReLU activations, followed by a softmax output layer (ReLU performs much better than sigmoid here)
Uses Xavier initialization for the weights and biases
The first hidden layer has 128 neurons and the second has 32, as sketched below
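A sketch of this architecture in plain NumPy (the framework is an assumption, as is initializing the biases to zero rather than with Xavier):

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier(n_in, n_out):
    # Xavier/Glorot uniform initialization, scaled by fan-in + fan-out.
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

# 784 -> 128 -> 32 -> 10, matching the layer sizes above.
# Biases start at zero here for simplicity; the repo may apply
# Xavier to them as well, per the note above.
W1, b1 = xavier(784, 128), np.zeros(128)
W2, b2 = xavier(128, 32), np.zeros(32)
W3, b3 = xavier(32, 10), np.zeros(10)

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    # Subtract the row max for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(x):
    h1 = relu(x @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return softmax(h2 @ W3 + b3)
```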
Batch size = 128; the number of epochs depends on the training-set size (20 is enough in most cases; see the training sketch below)
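A training-loop sketch using these hyperparameters with mini-batch SGD and cross-entropy loss; the learning rate of 0.1 is an assumption, not a value taken from the repo.

```python
def train(train_x, train_y, epochs=20, batch_size=128, lr=0.1):
    global W1, b1, W2, b2, W3, b3
    n = train_x.shape[0]
    for epoch in range(epochs):
        order = rng.permutation(n)
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            x, y = train_x[idx], train_y[idx]
            m = x.shape[0]
            # Forward pass, keeping intermediates for backprop.
            h1 = relu(x @ W1 + b1)
            h2 = relu(h1 @ W2 + b2)
            p = softmax(h2 @ W3 + b3)
            # Softmax + cross-entropy gradient: (p - one_hot(y)) / m.
            d3 = p
            d3[np.arange(m), y] -= 1.0
            d3 /= m
            # Backpropagate through the ReLU hidden layers.
            d2 = (d3 @ W3.T) * (h2 > 0)
            d1 = (d2 @ W2.T) * (h1 > 0)
            # SGD parameter update.
            W3 -= lr * (h2.T @ d3); b3 -= lr * d3.sum(axis=0)
            W2 -= lr * (h1.T @ d2); b2 -= lr * d2.sum(axis=0)
            W1 -= lr * (x.T @ d1);  b1 -= lr * d1.sum(axis=0)
```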
Achieves test accuracy as high as 97%
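One way to check that figure under the sketches above: accuracy is the fraction of argmax predictions that match the test labels.

```python
def accuracy(x, y):
    preds = forward(x).argmax(axis=1)
    return (preds == y).mean()

train(train_x, train_y)
print(f"test accuracy: {accuracy(test_x, test_y):.4f}")
```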