This example is mainly to demonstrate:
- How to train an RNN with state persisted across iterations. Here the state is simply managed inside the graph.
- How to use a TF reader pipeline instead of a DataFlow, for both training & inference.
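The first point can be illustrated outside of TensorFlow as well. Below is a minimal NumPy sketch (all names and sizes are hypothetical, not from this example): the token stream is split into contiguous chunks, and the hidden state produced by one chunk is fed into the next instead of being reset, which is what "persistent state between iterations" means here.

```python
import numpy as np

# Hypothetical sizes, for illustration only
vocab, hidden, steps = 10, 8, 4
rng = np.random.default_rng(0)
Wx = rng.normal(size=(vocab, hidden)) * 0.1   # input-to-hidden weights
Wh = rng.normal(size=(hidden, hidden)) * 0.1  # hidden-to-hidden weights

def run_chunk(tokens, h):
    # Unroll a simple RNN over one chunk, starting from the carried-over state.
    for t in tokens:
        h = np.tanh(Wx[t] + h @ Wh)
    return h

data = rng.integers(0, vocab, size=20)  # one long contiguous token stream
h = np.zeros(hidden)                    # initial state
for i in range(0, len(data), steps):    # successive, contiguous chunks
    h = run_chunk(data[i:i + steps], h) # state persists between iterations
print(h.shape)
```

In the actual example this carried-over state lives in the TF graph itself, so no state needs to be shuttled in and out of the graph at each training step.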
It trains a language model on the PTB dataset, reimplementing an equivalent of the PTB example in tensorflow/models with its "medium" config, and matches the performance of the original example.
Note that the input data pipeline is completely copied from the tensorflow example.
To Train:
./PTB-LSTM.py