I am learning PyTorch, so I am trying out different examples from the tutorials. I will keep adding more as I work through new examples.
Simple 2-layer feedforward network trained on data drawn from a multivariate normal distribution
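For orientation, a minimal sketch of such a network (layer sizes, the Gaussian parameters, and the dummy labels below are illustrative, not the ones used in this repo):

```python
import torch
import torch.nn as nn

in_dim, hidden_dim, out_dim, n_samples = 10, 32, 2, 512  # illustrative sizes

# Synthetic inputs drawn from a multivariate normal distribution.
mean, cov = torch.zeros(in_dim), torch.eye(in_dim)
x = torch.distributions.MultivariateNormal(mean, cov).sample((n_samples,))
y = torch.randint(0, out_dim, (n_samples,))  # dummy class labels

model = nn.Sequential(
    nn.Linear(in_dim, hidden_dim),
    nn.ReLU(),
    nn.Linear(hidden_dim, out_dim),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```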
Added word2vec. Specify the folder containing all the documents; the content of each document must be on a single line. Trained with stochastic gradient descent. A rough skip-gram sketch is shown below.
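A rough sketch of the skip-gram flavour of word2vec trained with SGD (file reading, vocabulary building, and window extraction are omitted; all sizes and names are illustrative):

```python
import torch
import torch.nn as nn

vocab_size, embed_dim = 1000, 64  # illustrative sizes

class SkipGram(nn.Module):
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, embed_dim)  # center-word vectors
        self.out_proj = nn.Linear(embed_dim, vocab_size)      # scores over context words

    def forward(self, center):
        return self.out_proj(self.in_embed(center))

model = SkipGram(vocab_size, embed_dim)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

# (center, context) index pairs would come from sliding a window over each line of text.
center = torch.randint(0, vocab_size, (128,))
context = torch.randint(0, vocab_size, (128,))

optimizer.zero_grad()
loss = loss_fn(model(center), context)
loss.backward()
optimizer.step()
```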
Classifying names with a 2-layer perceptron and a character-level RNN
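A minimal sketch of the character-level RNN part (alphabet size, hidden size, and number of language classes are placeholders):

```python
import torch
import torch.nn as nn

n_chars, hidden_size, n_classes = 57, 128, 18  # placeholder sizes

class CharRNNClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(n_chars, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, n_chars) one-hot encoded characters of a name
        _, h = self.rnn(x)
        return self.fc(h[-1])  # classify from the final hidden state

model = CharRNNClassifier()
name = torch.zeros(1, 6, n_chars)  # one-hot encoding of a 6-character name
logits = model(name)               # (1, n_classes)
pred = logits.argmax(dim=1)
```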
Added the graph convolutional network proposed by Thomas Kipf et al. (train-test split implementation incomplete)
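The core of that paper is the propagation rule H' = sigma(D^-1/2 A D^-1/2 H W) with self-loops added to A. A dense-adjacency sketch of one such layer, with toy sizes, might look like this (not the code in this repo):

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph convolution layer in the style of Kipf & Welling (dense adjacency)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Add self-loops, then symmetrically normalise the adjacency matrix.
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return torch.relu(self.linear(a_norm @ x))

# Toy graph: 4 nodes with 3 features each.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [0., 0., 1., 0.]])
x = torch.randn(4, 3)
out = GCNLayer(3, 8)(x, adj)  # (4, 8) node embeddings
```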
VAE: a variational autoencoder that generates new words using character-level RNNs. Needs training on larger data to produce meaningful results.
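The two VAE-specific ingredients, the reparameterisation trick and the KL term of the ELBO, can be sketched on their own (the encoder and decoder RNNs are omitted; latent size and tensors are placeholders):

```python
import torch

latent_dim = 16                      # placeholder latent size
mu = torch.randn(1, latent_dim)      # would come from the encoder RNN's final state
logvar = torch.randn(1, latent_dim)  # likewise

# Reparameterisation trick: sample z while keeping gradients w.r.t. mu and logvar.
std = torch.exp(0.5 * logvar)
z = mu + std * torch.randn_like(std)

# KL divergence between q(z|x) and a standard-normal prior.
kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
# Total loss = per-character reconstruction cross-entropy from the decoder RNN + kl.
```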
Deep Q-network (DQN) added. A simple 2-layer feedforward network estimates the Q-function. It does not fully work yet, but it is enough to understand the technique.
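A sketch of such a Q-network and one update against the TD target (environment, replay buffer, and target network omitted; sizes and the fake transition are illustrative):

```python
import torch
import torch.nn as nn

state_dim, n_actions, gamma = 4, 2, 0.99  # illustrative, CartPole-like sizes

q_net = nn.Sequential(
    nn.Linear(state_dim, 64),
    nn.ReLU(),
    nn.Linear(64, n_actions),
)
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

# One gradient step on a fake transition (s, a, r, s').
s = torch.randn(1, state_dim)
a = torch.tensor([0])
r = torch.tensor([1.0])
s_next = torch.randn(1, state_dim)

q_sa = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
with torch.no_grad():
    target = r + gamma * q_net(s_next).max(dim=1).values  # TD target
loss = nn.functional.mse_loss(q_sa, target)

optimizer.zero_grad()
loss.backward()
optimizer.step()
```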
Vanilla GAN added
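A compressed sketch of the alternating vanilla-GAN updates on toy data (network sizes and the stand-in "real" data are placeholders):

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 2  # illustrative sizes

G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

real = torch.randn(64, data_dim) + 3.0  # stand-in for real data
ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

# Discriminator step: push real samples towards 1 and fakes towards 0.
fake = G(torch.randn(64, latent_dim)).detach()
d_loss = bce(D(real), ones) + bce(D(fake), zeros)
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# Generator step: make the discriminator label fakes as real.
fake = G(torch.randn(64, latent_dim))
g_loss = bce(D(fake), ones)
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```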
Transfer learning: a simple classifier is trained on synthetic data, then a new layer is added so it can perform a similar classification task on different but related data.
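A minimal sketch of that idea, assuming the earlier layers are frozen and only the new head is trained on the related task (sizes, class counts, and data are illustrative):

```python
import torch
import torch.nn as nn

# Pretend this was already trained on the original synthetic data.
pretrained = nn.Sequential(
    nn.Linear(10, 32), nn.ReLU(),
    nn.Linear(32, 3),  # original 3-class head
)

# Freeze the pretrained layers, then swap in a new head for the related task.
for p in pretrained.parameters():
    p.requires_grad = False
pretrained[-1] = nn.Linear(32, 5)  # new, trainable layer (5 classes instead of 3)

optimizer = torch.optim.Adam(pretrained[-1].parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 10)            # data from the related task
y = torch.randint(0, 5, (64,))
optimizer.zero_grad()
loss_fn(pretrained(x), y).backward()
optimizer.step()
```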
Graph attention network (GAT)
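A single-head graph attention layer in the spirit of Veličković et al., using a dense adjacency matrix purely for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention layer (dense adjacency, illustrative)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)  # attention scorer

    def forward(self, x, adj):
        h = self.W(x)                                   # (N, out_dim)
        n = h.size(0)
        # Pairwise features [h_i || h_j] for every node pair.
        h_i = h.unsqueeze(1).expand(n, n, -1)
        h_j = h.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([h_i, h_j], dim=-1)).squeeze(-1), 0.2)
        # Mask out non-edges, normalise attention over each node's neighbours.
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=1)
        return F.elu(alpha @ h)

# Toy graph with self-loops so every node attends to at least itself.
adj = torch.eye(4) + torch.tensor([[0., 1., 0., 0.],
                                   [1., 0., 1., 0.],
                                   [0., 1., 0., 1.],
                                   [0., 0., 1., 0.]])
out = GATLayer(3, 8)(torch.randn(4, 3), adj)  # (4, 8) node embeddings
```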
GloVe added
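GloVe fits word and context vectors to the log co-occurrence counts with a weighted least-squares loss. A toy sketch of that objective (the co-occurrence matrix and hyperparameters below are made up for illustration):

```python
import torch

vocab_size, dim, x_max, alpha = 50, 16, 100.0, 0.75  # illustrative hyperparameters

X = torch.randint(0, 50, (vocab_size, vocab_size)).float() + 1.0  # fake co-occurrence counts
w = torch.randn(vocab_size, dim, requires_grad=True)        # word vectors
w_tilde = torch.randn(vocab_size, dim, requires_grad=True)  # context vectors
b = torch.zeros(vocab_size, requires_grad=True)
b_tilde = torch.zeros(vocab_size, requires_grad=True)

optimizer = torch.optim.Adagrad([w, w_tilde, b, b_tilde], lr=0.05)

# Weighted least-squares loss: f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2
f = torch.clamp((X / x_max) ** alpha, max=1.0)  # weighting function
pred = w @ w_tilde.t() + b.unsqueeze(1) + b_tilde.unsqueeze(0)
loss = (f * (pred - torch.log(X)) ** 2).sum()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```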