This repo contains code for Data-to-text Generation with Variational Sequential Planning (Ratish Puduppully, Yao Fu, and Mirella Lapata; Transactions of the Association for Computational Linguistics (TACL)). The code is based on an earlier fork (version 0.9.2) of OpenNMT-py. If you use this code, please cite the paper:
@article{puduppully-2021-seq-plan,
  author  = {Ratish Puduppully and Yao Fu and Mirella Lapata},
  title   = {Data-to-text Generation with Variational Sequential Planning},
  journal = {Transactions of the Association for Computational Linguistics (to appear)},
  url     = {https://arxiv.org/abs/2202.13756},
  year    = {2022}
}
All dependencies can be installed via:
pip install -r requirements.txt
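For example, assuming Python 3 is available, a minimal setup in a fresh virtual environment might look like the following (the environment name venv is arbitrary):

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt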
The steps for training and inference on the MLB dataset are given in README_MLB.
Trained models are available for MLB, RotoWire, and German RotoWire.
Model outputs are available for MLB and German RotoWire.
Part of the code is based on the Sequential Knowledge Transformer repo.