An implementation of a two-hidden-layer neural network using NumPy, with the MNIST dataset loaded via TensorFlow.
The multilayer perceptron consists of 1 input layer, 2 hidden layers, and 1 output layer (a minimal forward-pass sketch follows the spec below):
Input layer = 784 neurons;
Hidden layer 1 = 256 neurons; ReLU {activation function};
Hidden layer 2 = 256 neurons; ReLU {activation function};
Output layer = 10 neurons; softmax {activation function}; cross-entropy {loss function};
Weight optimization = Adam;
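As a rough illustration of the architecture above, here is a minimal NumPy sketch of the forward pass and cross-entropy loss. The variable names (W1, b1, ...), the random initialization scheme, and the function names are assumptions for illustration only, not taken from this repository or the linked example.

```python
import numpy as np

# Layer sizes matching the spec above: 784 -> 256 -> 256 -> 10.
n_input, n_hidden_1, n_hidden_2, n_classes = 784, 256, 256, 10

rng = np.random.default_rng(0)
# Parameter names (W1, b1, ...) and small random init are illustrative assumptions.
W1 = rng.standard_normal((n_input, n_hidden_1)) * 0.01
b1 = np.zeros(n_hidden_1)
W2 = rng.standard_normal((n_hidden_1, n_hidden_2)) * 0.01
b2 = np.zeros(n_hidden_2)
W3 = rng.standard_normal((n_hidden_2, n_classes)) * 0.01
b3 = np.zeros(n_classes)

def relu(x):
    return np.maximum(0, x)

def softmax(logits):
    # Shift by the row max for numerical stability before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

def forward(X):
    # X: batch of flattened MNIST images, shape (batch, 784).
    h1 = relu(X @ W1 + b1)          # hidden layer 1, ReLU
    h2 = relu(h1 @ W2 + b2)         # hidden layer 2, ReLU
    return softmax(h2 @ W3 + b3)    # class probabilities, shape (batch, 10)

def cross_entropy(probs, labels):
    # labels: integer class ids, shape (batch,).
    batch = labels.shape[0]
    return -np.mean(np.log(probs[np.arange(batch), labels] + 1e-12))
```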
Adapted from the TensorFlow example at: https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/3_NeuralNetworks/multilayer_perceptron.py
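The weight optimization listed above is Adam. The sketch below shows the standard Adam update rule in NumPy as one plausible way to implement it by hand; the function name and the hyperparameter defaults are the commonly used Adam defaults, assumed here rather than taken from this repository.

```python
import numpy as np

def adam_update(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array.

    m, v are the running first/second moment estimates and t is the
    1-based step count. Defaults are the usual Adam hyperparameters
    (an assumption, not read from this repository's code).
    """
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

In a training loop, one (m, v) pair would be kept per weight matrix and bias vector, and adam_update would be called once per parameter after backpropagation computes its gradient.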