This repo is associated with the paper "Transformer-Based Neural Text Generation with Syntactic Guidance".
Download the training and test data from here and copy the `train` and `test` folders into the `data` folder.
If your `data` folder is in this directory, you can directly run the script `run_syn_train.sh`:

```shell
./run_syn_train.sh
```

Otherwise, you need to specify the `--ori_dir`, `--ref_dir`, and `--dict_dir` parameters.
The trained model will be saved in the `models` folder under the name `model.<date>.best.synlvl.chkpt`.
Again, if you have the `data` folder in your current directory, you can directly run the script `run_txt_gen_train.sh`. Otherwise, you need to specify the `--ori_dir`, `--ref_dir`, and `--dict_dir` parameters.
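As in the previous step, a sketch of an invocation with explicit paths (the paths are placeholders, and this assumes the script accepts these flags on the command line):

```shell
# Hypothetical paths -- substitute your own data locations.
./run_txt_gen_train.sh --ori_dir path/to/train/ori --ref_dir path/to/train/ref --dict_dir path/to/dict
```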
You need to edit the script `run_txt_generate.sh` before using it. First, substitute the `<date>` part of `TXT_MODEL_PATH` with the real value. Then, you may want to specify `--bpe_model_path`, `--test_data_path`, and `--dict_path` if you do not have the `data` folder in the current directory.
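As a sketch, the edit amounts to replacing the placeholder in the path with a real date stamp. The example filename below is invented for illustration; the exact pattern is whatever the training step actually wrote into your `models` folder:

```shell
# Before: the script ships with a <date> placeholder, e.g.
#   TXT_MODEL_PATH=./models/model.<date>.best.chkpt
# After: fill in the date stamp of the checkpoint saved in ./models, e.g.
#   TXT_MODEL_PATH=./models/model.2023-05-01.best.chkpt
```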
Note that if you train the text generator on one GPU or no GPU, you have to delete lines 49 and 51 of the file `./TextGen/Generator.py`.
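If you prefer not to edit the file by hand, a one-liner such as the following should work, assuming a `sed` that supports `-i` with a backup suffix (both GNU and BSD sed do). Note that `sed` addresses refer to input line numbers, so both deletions target the original numbering in one pass:

```shell
# Delete lines 49 and 51 of Generator.py, keeping a .bak backup of the original.
sed -i.bak '49d;51d' ./TextGen/Generator.py
```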
The generated text will be saved in the `./generations` folder.
The overall procedure is the same as above; the script to run is `run_txt_gen_from_tmpl.sh`.
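Assuming the template-based script needs the same `TXT_MODEL_PATH` edit and optional path flags as `run_txt_generate.sh`, the run itself is just:

```shell
./run_txt_gen_from_tmpl.sh
```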