Implementation of a Deep Neural Network with flexible architectures and activation functions. Building blocks (illustrative sketches follow the list):
- FullyConnectedLayer
- ReLU (Rectified Linear Unit)
- Sigmoid
- Tanh
- mini-batch gradient descent
- stochastic gradient descent
- Sum-of-squares (squared-error) loss
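
The core pieces can be sketched as follows. This is a minimal illustration assuming NumPy; the class names match the list above, but the `forward`/`backward` method signatures and the in-place gradient-descent update are assumptions, not necessarily this repository's actual API.

```python
import numpy as np

class FullyConnectedLayer:
    """Dense layer computing y = x @ W + b (illustrative sketch)."""
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_in, n_out) * 0.01  # small random init
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                       # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out, lr):
        # Gradients w.r.t. parameters and input, then a plain gradient step.
        grad_W = self.x.T @ grad_out
        grad_b = grad_out.sum(axis=0)
        grad_x = grad_out @ self.W.T
        self.W -= lr * grad_W
        self.b -= lr * grad_b
        return grad_x

class ReLU:
    def forward(self, x):
        self.mask = x > 0
        return x * self.mask

    def backward(self, grad_out, lr):    # lr unused; kept for a uniform interface
        return grad_out * self.mask

class Sigmoid:
    def forward(self, x):
        self.out = 1.0 / (1.0 + np.exp(-x))
        return self.out

    def backward(self, grad_out, lr):
        return grad_out * self.out * (1.0 - self.out)

class Tanh:
    def forward(self, x):
        self.out = np.tanh(x)
        return self.out

    def backward(self, grad_out, lr):
        return grad_out * (1.0 - self.out ** 2)
```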
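
The sum-of-squares loss and its gradient might look like the sketch below. The 1/2 factor is a common convention that makes the gradient simply `pred - target`; whether this implementation uses that factor is an assumption.

```python
import numpy as np

def sum_of_squares_loss(pred, target):
    """L = 1/2 * sum((pred - target)^2); returns the loss and dL/dpred."""
    diff = pred - target
    return 0.5 * np.sum(diff ** 2), diff
```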
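
Finally, a hypothetical training loop (building on the sketches above) showing how mini-batch and stochastic gradient descent relate: shuffle once per epoch, step through batches, and note that `batch_size=1` recovers pure SGD. Function and parameter names are illustrative.

```python
def train(layers, X, y, lr=0.01, batch_size=32, epochs=10):
    """Mini-batch gradient descent; batch_size=1 gives stochastic gradient descent."""
    n = X.shape[0]
    for epoch in range(epochs):
        perm = np.random.permutation(n)        # reshuffle each epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            out = X[idx]
            for layer in layers:               # forward pass
                out = layer.forward(out)
            loss, grad = sum_of_squares_loss(out, y[idx])
            for layer in reversed(layers):     # backward pass with updates
                grad = layer.backward(grad, lr)

# Example: a tiny network on toy data (batch_size=1 would give pure SGD).
X = np.random.randn(100, 2)
y = (X[:, :1] * X[:, 1:] > 0).astype(float)
net = [FullyConnectedLayer(2, 16), Tanh(), FullyConnectedLayer(16, 1), Sigmoid()]
train(net, X, y, lr=0.1, batch_size=8, epochs=50)
```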
This project is licensed under the terms of the MIT license; see LICENSE.