Original input (28x28):
Decoded output (16x1) ⟶ (28x28):
This project implements an autoencoder (AE) trained on the MNIST dataset. The AE uses no convolutional layers; the encoder and decoder each consist of two dense layers. A variant of the Adam optimiser is used in each implementation.
The source code is written in Python (PyTorch/NumPy), Cython, and C; training execution times for the implementations will be compared soon.
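For reference, a minimal PyTorch sketch of the network described above. The 16-dimensional latent size is taken from the figure caption; the hidden width of 128 and the ReLU/Sigmoid activations are assumptions:

```python
import torch
import torch.nn as nn

class DenseAE(nn.Module):
    """Fully connected AE: two dense layers each in the encoder and decoder."""
    def __init__(self, hidden=128, latent=16):
        super().__init__()
        # Encoder: 28*28 pixels -> hidden -> 16-dimensional code.
        self.encoder = nn.Sequential(
            nn.Linear(28 * 28, hidden), nn.ReLU(),
            nn.Linear(hidden, latent),
        )
        # Decoder: code -> hidden -> reconstructed 28*28 image.
        self.decoder = nn.Sequential(
            nn.Linear(latent, hidden), nn.ReLU(),
            nn.Linear(hidden, 28 * 28), nn.Sigmoid(),
        )

    def forward(self, x):
        # Flatten (N, 1, 28, 28) -> (N, 784), encode, then decode.
        z = self.encoder(x.view(x.size(0), -1))
        return self.decoder(z).view_as(x)
```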
- C autoencoder forked from the C-based MNIST classifier here
- Python autoencoder adapted from here
- Basic Neural Network in C
- Normal distribution generator for C
- PyTorch AE
- DEMON Adam optimiser (see the momentum-decay sketch after this list)
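DEMON (decaying momentum) anneals Adam's beta1 from its initial value towards zero over the course of training. A minimal sketch of the schedule and one way to apply it to a PyTorch Adam optimiser; the stand-in model and step count are placeholders:

```python
import torch

def demon_beta1(t, T, beta_init=0.9):
    # DEMON schedule: beta1 equals beta_init at t = 0 and decays to 0 at t = T.
    frac = 1.0 - t / T
    return beta_init * frac / ((1.0 - beta_init) + beta_init * frac)

model = torch.nn.Linear(28 * 28, 16)   # stand-in for the AE
opt = torch.optim.Adam(model.parameters(), betas=(0.9, 0.999))

T = 1000                               # total training steps (placeholder)
for t in range(T):
    # Adam re-reads betas from each param group on every step, so the
    # schedule can be applied by overwriting group["betas"] in place.
    for group in opt.param_groups:
        group["betas"] = (demon_beta1(t, T), group["betas"][1])
    # ... forward pass, loss.backward(), opt.step(), opt.zero_grad() ...
```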
- AE implemented in C using the Adam optimiser
- Implemented batch training and composite matrix operations in C (see the sketch after this list)
- AE implemented in PyTorch and Cython with the DEMON Adam optimiser
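As an illustration of the batched, composite operations mentioned above (this NumPy version is my reading of the C changes; the shapes and values are illustrative): rather than looping over samples and applying matmul, bias add, and activation separately, the whole mini-batch goes through one fused dense-layer call.

```python
import numpy as np

def dense_batch(X, W, b):
    # One composite pass for a whole mini-batch:
    # matmul, bias add and ReLU in a single fused expression.
    return np.maximum(X @ W + b, 0.0)

rng = np.random.default_rng(0)
X = rng.random((64, 784))                   # mini-batch of 64 flattened images
W = rng.standard_normal((784, 128)) * 0.01  # dense-layer weights
b = np.zeros(128)                           # bias
H = dense_batch(X, W, b)                    # (64, 128) hidden activations
```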
- Add DEMON to the Adam optimiser in the C source code.
- Implement a disentangled VAE in all languages.
- Test and compare execution times.