erikvdplas/gru-rnn

GRU RNN

A minimal, elaborately commented implementation of a recurrent neural network with GRUs (Gated Recurrent Units; Cho et al.), applied to predicting the next character in a document given the preceding characters, in the same spirit as Andrej Karpathy's minimal vanilla RNN implementation.
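As a rough sketch of the core recurrence, one forward step of a GRU cell can be written in NumPy as below. The parameter names (`Wz`, `Uz`, `bz`, etc.) and the helper `init_params` are illustrative assumptions, not necessarily the names used in this repository's code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    """One GRU forward step: gates decide how much of the old state to keep."""
    z = sigmoid(p['Wz'] @ x + p['Uz'] @ h + p['bz'])            # update gate
    r = sigmoid(p['Wr'] @ x + p['Ur'] @ h + p['br'])            # reset gate
    h_hat = np.tanh(p['Wh'] @ x + p['Uh'] @ (r * h) + p['bh'])  # candidate state
    return (1.0 - z) * h + z * h_hat                            # blended new state

def init_params(n_in, n_h, scale=0.01):
    """Small random weights for the three gates (illustrative initialization)."""
    rng = np.random.default_rng(0)
    p = {}
    for g in ('z', 'r', 'h'):
        p['W' + g] = rng.normal(0.0, scale, (n_h, n_in))
        p['U' + g] = rng.normal(0.0, scale, (n_h, n_h))
        p['b' + g] = np.zeros(n_h)
    return p
```

Here `x` would be a one-hot encoding of the current character and `h` the hidden state carried between steps.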

Running this implementation requires Python 3.x and NumPy. The main.py script trains the model via backpropagation to predict character sequences in input.txt. Every 100 model updates, the model samples and prints a piece of fully generated text.
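The sampling step can be sketched as follows: repeatedly advance the hidden state, turn the output logits into a probability distribution with a softmax, and draw the next character from it. The function signature and output-layer names (`Wy`, `by`) are assumptions for illustration, not the repository's actual API:

```python
import numpy as np

def softmax(y):
    """Numerically stable softmax over a logit vector."""
    e = np.exp(y - np.max(y))
    return e / e.sum()

def sample(seed_ix, n_chars, step_fn, h, Wy, by, vocab_size):
    """Generate a sequence of character indices from the trained model."""
    x = np.zeros(vocab_size)
    x[seed_ix] = 1.0
    out = [seed_ix]
    for _ in range(n_chars):
        h = step_fn(x, h)                      # advance the recurrent state
        probs = softmax(Wy @ h + by)           # distribution over next characters
        ix = np.random.choice(vocab_size, p=probs)
        x = np.zeros(vocab_size)               # feed the sampled char back in
        x[ix] = 1.0
        out.append(ix)
    return out
```

Mapping the returned indices back through the character vocabulary yields the printed text.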

The provided input.txt contains 40 paragraphs of lorem ipsum. After training, the resulting model produces entertaining pseudo-Latin.
