Operation-guided Neural Networks for High Fidelity Data-To-Text Generation
- Feng Nie,
- Jinpeng Wang,
- Jin-Ge Yao,
- Rong Pan,
- Chin-Yew Lin
Empirical Methods in Natural Language Processing
Published by Association for Computational Linguistics
Recent neural models for data-to-text generation are mostly based on data-driven, end-to-end training over encoder-decoder networks. Although the generated texts are usually fluent and informative, these models often produce descriptions that are inconsistent with the input structured data. This is a critical issue, especially in domains that require inference or calculation over raw data. In this paper, we attempt to improve the fidelity of neural data-to-text generation by utilizing pre-executed symbolic operations. We propose a framework called Operation-guided Attention-based sequence-to-sequence network (OpAtt), with a specifically designed gating mechanism as well as a quantization module for operation results, which together allow the decoder to utilize information from pre-executed operations. Experiments on two sports datasets show that our proposed method clearly improves the fidelity of the generated texts to the input structured data.
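To make the idea of quantized operation results and a gating mechanism concrete, below is a minimal, hedged sketch in PyTorch. It is not the authors' released implementation: the module name `OperationResultGate`, the hard bucketing via `torch.bucketize`, and all dimensions are illustrative assumptions; the paper's quantization module may differ (e.g., a learned soft assignment over bins). The sketch only shows one plausible way to map a pre-executed numeric result (such as a score margin) to a bin embedding and gate it into a decoder hidden state.

```python
import torch
import torch.nn as nn


class OperationResultGate(nn.Module):
    """Hedged sketch: fuse a quantized operation result into one decoder step.

    Hypothetical module; names and dimensions are assumptions for illustration,
    not the OpAtt authors' code.
    """

    def __init__(self, num_bins: int, embed_dim: int, hidden_dim: int):
        super().__init__()
        # Quantization side: each discrete bin of the operation result
        # (e.g. a "minus" score margin) gets a learned embedding.
        self.bin_embeddings = nn.Embedding(num_bins, embed_dim)
        # Gating side: decide, per decoding step, how strongly the
        # operation information should influence the decoder state.
        self.gate = nn.Linear(hidden_dim + embed_dim, hidden_dim)
        self.proj = nn.Linear(embed_dim, hidden_dim)

    def quantize(self, op_result: torch.Tensor, boundaries: torch.Tensor) -> torch.Tensor:
        # Hard-bucketize the raw numeric result into discrete bins, then
        # look up the bin embedding (a simplification of soft quantization).
        bin_ids = torch.bucketize(op_result, boundaries)
        return self.bin_embeddings(bin_ids)

    def forward(self, decoder_state: torch.Tensor, op_result: torch.Tensor,
                boundaries: torch.Tensor) -> torch.Tensor:
        op_embed = self.quantize(op_result, boundaries)  # (batch, embed_dim)
        gate = torch.sigmoid(
            self.gate(torch.cat([decoder_state, op_embed], dim=-1)))
        # Convex-style mix of operation information and the original state.
        return gate * torch.tanh(self.proj(op_embed)) + (1.0 - gate) * decoder_state


if __name__ == "__main__":
    fuse = OperationResultGate(num_bins=6, embed_dim=16, hidden_dim=32)
    state = torch.randn(2, 32)                         # decoder hidden states
    margin = torch.tensor([3.0, 18.0])                 # pre-executed score differences
    bins = torch.tensor([1.0, 5.0, 10.0, 20.0, 30.0])  # 5 boundaries -> 6 bins
    print(fuse(state, margin, bins).shape)             # torch.Size([2, 32])
```

The design intuition, consistent with the abstract, is that exact numbers matter less to the surface text than their rough magnitude ("narrow win" vs. "blowout"), so quantizing operation results before gating them into generation is a reasonable way to expose inferred facts to the decoder.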