A minimal, extensively commented implementation of a recurrent neural network with GRUs (Gated Recurrent Units; Cho et al., 2014), applied to predicting the next character in a document from a sequence of preceding characters, in the same spirit as Andrej Karpathy's minimal vanilla RNN implementation.
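For reference, the GRU update that such an implementation is built around can be sketched in a few lines of NumPy. This is a minimal sketch of the Cho et al. (2014) equations; the parameter names (`Wz`, `Uz`, `bz`, ...) are illustrative and need not match those used in main.py:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU forward step (Cho et al., 2014).

    x      : input vector, e.g. a one-hot encoded character
    h_prev : hidden state from the previous time step
    params : dict of weight matrices W*, U* and bias vectors b*
             (illustrative names, not necessarily those in main.py)
    """
    Wz, Uz, bz = params["Wz"], params["Uz"], params["bz"]
    Wr, Ur, br = params["Wr"], params["Ur"], params["br"]
    Wh, Uh, bh = params["Wh"], params["Uh"], params["bh"]

    z = sigmoid(Wz @ x + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)   # candidate state
    h = z * h_prev + (1.0 - z) * h_tilde                 # interpolate old and candidate state
    return h
```

The gates are what distinguish the GRU from a vanilla RNN: the reset gate `r` decides how much of the previous state feeds into the candidate, and the update gate `z` decides how much of the previous state is carried over unchanged.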
Running this implementation requires an installation of Python 3.x and NumPy. Executing the main.py script starts fitting the model, through backpropagation, to predict character sequences in input.txt. Every 100 model updates, the model samples and prints a piece of fully predicted text.
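The periodic sampling step can be sketched as follows, in the style of Karpathy's sampling routine. It builds on the hypothetical `gru_step` above; `Wy` and `by` stand for the output-layer parameters and are likewise illustrative names, not necessarily those used in main.py:

```python
def sample(h, seed_ix, n, params, Wy, by, vocab_size):
    """Sample n character indices from the model, starting from seed_ix.

    h : initial hidden state; params, Wy, by : model parameters
    (assumed shapes: Wy is vocab_size x hidden_size, by is vocab_size)
    """
    x = np.zeros(vocab_size)
    x[seed_ix] = 1.0
    ixes = []
    for _ in range(n):
        h = gru_step(x, h, params)
        y = Wy @ h + by              # unnormalized log-probabilities
        p = np.exp(y - np.max(y))
        p /= p.sum()                 # softmax over the vocabulary
        ix = np.random.choice(vocab_size, p=p)
        x = np.zeros(vocab_size)     # feed the sampled character back in
        x[ix] = 1.0
        ixes.append(ix)
    return ixes
```

Each sampled character is fed back as the next input, so the printed text is fully predicted by the model rather than copied from input.txt.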
The given input.txt contains 40 paragraphs of lorem ipsum. After training, the resulting model produces fun pseudo-Latin.