Research on data-to-text generation based on transformer model and deep neural network
CLC Number: TP311

Abstract:

Data-to-text generation is a natural language processing task that produces coherent text from structured data. In recent years, data-to-text generation has shown great promise thanks to popular neural network architectures trained end-to-end. Such methods can automatically process large amounts of data and generate coherent text, and are often used in news writing, report generation, and similar applications. However, existing research has defects in reasoning over information such as specific numerical values and times, and therefore cannot make full use of the structural information of the data to provide reasonable guidance for generation. In addition, the generation process is prone to separating semantics from syntax during training. In this paper, a data-to-text generation method based on the Transformer model and a deep neural network is proposed, and a Transformer text planning (TTP) algorithm is introduced to effectively control the contextual information of the generated text and remove the deficiency of previous models that led to the separation of semantics and syntax. Experimental results on the public Rotowire dataset show that the proposed method outperforms existing models and can be applied directly to the task of generating coherent text from scattered data.
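Data-to-text systems of this kind typically start by flattening structured records (such as Rotowire box-score entries) into a token sequence that a Transformer encoder can consume. The following is a minimal sketch of that common linearization step, not the paper's own implementation; the field names and values are illustrative:

```python
def linearize_records(records):
    """Flatten (entity, attribute, value) records into a token sequence
    for a sequence-to-sequence (e.g. Transformer) encoder.

    This mirrors the common <entity|attribute|value> linearization used
    in data-to-text work on Rotowire-style box scores; the exact format
    here is an assumption for illustration, not the paper's.
    """
    tokens = []
    for entity, attribute, value in records:
        tokens.append(f"<{entity}|{attribute}|{value}>")
    return " ".join(tokens)


# Hypothetical Rotowire-style box-score records (made up for this sketch).
box_score = [
    ("Raptors", "TEAM-PTS", "122"),
    ("Raptors", "TEAM-WINS", "11"),
    ("Kyle_Lowry", "PTS", "24"),
    ("Kyle_Lowry", "AST", "8"),
]

print(linearize_records(box_score))
# → <Raptors|TEAM-PTS|122> <Raptors|TEAM-WINS|11> <Kyle_Lowry|PTS|24> <Kyle_Lowry|AST|8>
```

A content-planning stage such as the paper's TTP would then select and order a subset of these records before the decoder verbalizes them, which is what lets the model keep numbers and times consistent with the source data.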

Get Citation

XU Xiaohong, HE Ting, WANG Huazhen, CHEN Jian. Data-to-text generation method combining the Transformer model and deep neural networks[J]. Journal of Chongqing University, 2020, 43(7): 91-100.

History
  • Received: December 09, 2019
  • Online: July 18,2020