An implementation of a 2-hidden-layer neural network (using NumPy), tested on the MNIST dataset

deepakrana47/Multi-layer-precertron

Multilayer-precertron (multilayer perceptron)

An implementation of a 2-hidden-layer neural network using NumPy, with the MNIST dataset loaded via TensorFlow.

The multilayer perceptron consists of 1 input layer, 2 hidden layers, and 1 output layer.

Input layer = 784 neurons

Hidden layer 1 = 256 neurons; ReLU activation

Hidden layer 2 = 256 neurons; ReLU activation

Output layer = 10 neurons; softmax activation; cross-entropy loss

Weight optimization = Adam
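The forward pass of the architecture above can be sketched in NumPy as follows. This is a minimal illustration, not the repository's actual code: the weight initialization, variable names, and batch shape are assumptions, and the Adam update step is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from the architecture described above: 784 -> 256 -> 256 -> 10.
n_input, n_hidden1, n_hidden2, n_output = 784, 256, 256, 10

# Hypothetical He-style initialization; the repository's own scheme may differ.
W1 = rng.standard_normal((n_input, n_hidden1)) * np.sqrt(2.0 / n_input)
b1 = np.zeros(n_hidden1)
W2 = rng.standard_normal((n_hidden1, n_hidden2)) * np.sqrt(2.0 / n_hidden1)
b2 = np.zeros(n_hidden2)
W3 = rng.standard_normal((n_hidden2, n_output)) * np.sqrt(2.0 / n_hidden2)
b3 = np.zeros(n_output)

def relu(x):
    # Elementwise max(0, x).
    return np.maximum(0.0, x)

def softmax(x):
    # Subtract the per-row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(X):
    """Forward pass: input -> ReLU -> ReLU -> softmax class probabilities."""
    h1 = relu(X @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return softmax(h2 @ W3 + b3)

def cross_entropy(probs, labels):
    """Mean cross-entropy loss for integer class labels."""
    return -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()

# A random batch standing in for flattened 28x28 MNIST images.
X = rng.random((32, n_input))
y = rng.integers(0, n_output, 32)
probs = forward(X)
print(probs.shape)  # each row is a probability distribution over 10 digits
print(cross_entropy(probs, y))
```

Training would wrap `forward` and `cross_entropy` in a loop that backpropagates gradients and applies Adam updates to `W1`..`W3` and `b1`..`b3`.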

Based on the TensorFlow example at: https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/3_NeuralNetworks/multilayer_perceptron.py
