Research on data-to-text generation based on transformer model and deep neural network
Abstract:
Data-to-text generation is a natural language processing task that produces coherent text from structured data. In recent years, data-to-text generation has shown great promise owing to popular neural network architectures trained end-to-end. Such methods can automatically process large amounts of data and generate coherent text, and are widely used in news writing, report generation, and similar applications. However, existing research has shortcomings in reasoning over information such as specific numerical values and times, so it cannot make full use of the structural information in the data to provide reasonable guidance for generation. In addition, the generation process is prone to separating semantics from syntax during training. In this paper, a data-to-text generation method based on a transformer model and a deep neural network is proposed, together with a transformer text planning (TTP) algorithm, to effectively control the contextual information of the generated text and to remove the deficiency of previous models that causes the separation of semantics and syntax. Experimental results on the public Rotowire dataset show that the proposed method outperforms existing models and can be applied directly to the task of generating coherent text from scattered data.
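As background (not part of the paper itself), data-to-text systems trained on Rotowire-style datasets commonly linearize the structured records into a token sequence before feeding them to a transformer encoder. A minimal sketch of one such linearization, with hypothetical record fields and example values chosen only for illustration:

```python
def linearize_records(records):
    """Flatten structured records into a token sequence of the form
    ENTITY|TYPE|VALUE, a common input encoding for data-to-text
    transformer models (illustrative only; the paper's exact input
    representation may differ)."""
    tokens = []
    for rec in records:
        # Each record contributes one composite token; the model's
        # tokenizer would split these further in practice.
        tokens.append(f"{rec['entity']}|{rec['type']}|{rec['value']}")
    return " ".join(tokens)


# Hypothetical Rotowire-style records: per-player box-score statistics.
records = [
    {"entity": "LeBron_James", "type": "POINTS", "value": "32"},
    {"entity": "LeBron_James", "type": "ASSISTS", "value": "9"},
]
print(linearize_records(records))
```

The flat sequence produced this way preserves which value belongs to which entity and statistic type, which is the structural information the abstract argues should guide generation.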