P2_WebNLG2020

This is the GitHub repo for our paper "P2: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation" by Qipeng Guo, Zhijing Jin, Ning Dai, Xipeng Qiu, Xiangyang Xue, David Wipf, and Zheng Zhang.

Our model achieves first place (#1) on the English track of the WebNLG 2020 Challenge at the INLG 2020 Workshop.

Model Introduction

Our P2 model consists of two steps: a planning step that orders the input knowledge graph triples, and a generation step that verbalizes them with a pretrained language model (see the paper for details).

Codes

Run run.sh to train the model. The script fix_nonenglish.py is a post-processing step that maps characters back to their original non-English forms.
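As a rough illustration of what such a post-processing pass might look like, the sketch below restores non-English characters from ASCII placeholders via a lookup table. The placeholder scheme and mapping here are hypothetical; the actual fix_nonenglish.py may work differently.

```python
# Hypothetical sketch of restoring non-English characters from ASCII
# placeholders in model output. The placeholder format "(x)" and the
# mapping below are assumptions, not the scheme used by fix_nonenglish.py.
CHAR_MAP = {
    "(e)": "é",
    "(o)": "ö",
    "(u)": "ü",
}

def restore_nonenglish(line: str) -> str:
    """Replace every known placeholder with its original character."""
    for placeholder, original in CHAR_MAP.items():
        line = line.replace(placeholder, original)
    return line

print(restore_nonenglish("Malm(o)"))  # -> Malmö
```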

Our model output on WebNLG 2020 test set is available at output.txt.

If you have any questions, please feel free to email the first author, Qipeng Guo, at [email protected].

Citation

@article{guo2020p2,
  title={{P2}: A Plan-and-Pretrain Approach for Knowledge Graph-to-Text Generation},
  author={Guo, Qipeng and Jin, Zhijing and Dai, Ning and Qiu, Xipeng and Xue, Xiangyang and Wipf, David and Zhang, Zheng},
  year={2020}
}