Pure Python (+ numpy) implementation of selected neural network components. PyTorch is used only to load the MNIST dataset.
- Linear layer
- ReLU
- Dropout (implemented as inverted dropout)
- CrossEntropy loss
- Mini-batch SGD
- Xavier uniform initialization
- more to come :)
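The repository's own code isn't shown here, but the inverted-dropout variant listed above can be sketched as follows (function name and signature are illustrative, not the repo's API). Kept activations are scaled by `1/(1-p)` at train time, so inference needs no rescaling:

```python
import numpy as np

def dropout_forward(x, p=0.5, train=True, rng=None):
    """Inverted dropout: zero units with probability p and scale the
    survivors by 1/(1-p), keeping the expected activation unchanged."""
    if not train or p == 0.0:
        return x, None  # identity at inference time
    rng = rng or np.random.default_rng()
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask, mask  # mask is reused in the backward pass
```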
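Xavier (Glorot) uniform initialization draws weights from `U(-limit, limit)` with `limit = sqrt(6 / (fan_in + fan_out))`. A minimal sketch (the helper name is my own, not necessarily what the repo uses):

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix with Xavier uniform init."""
    rng = rng or np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```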
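The cross-entropy loss is typically fused with a numerically stable softmax, which also makes the gradient with respect to the logits trivially `softmax - one_hot`. A sketch under that assumption (not necessarily how the repo factors it):

```python
import numpy as np

def cross_entropy(logits, targets):
    """Mean cross-entropy over a batch of integer class targets.
    Returns the loss and its gradient w.r.t. the logits."""
    z = logits - logits.max(axis=1, keepdims=True)        # stability shift
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    n = logits.shape[0]
    loss = -log_probs[np.arange(n), targets].mean()
    grad = np.exp(log_probs)                              # softmax probabilities
    grad[np.arange(n), targets] -= 1.0                    # subtract one-hot
    return loss, grad / n
```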
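Mini-batch SGD amounts to shuffling the training set once per epoch and stepping through it in fixed-size slices. A minimal batching sketch (names are illustrative):

```python
import numpy as np

def sgd_minibatches(X, y, batch_size, rng=None):
    """Yield shuffled (X, y) mini-batches covering the dataset once."""
    rng = rng or np.random.default_rng()
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]
```

Each parameter update then applies `w -= lr * grad` with the gradient averaged over the current batch.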
MNIST results (accuracy):
- train: 96.54 %
- test: 96.32 %