Data-to-text generation is the task of generating natural language text from a structured data source, such as tables, database records, or knowledge-graph triples.
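In its simplest form, the task can be illustrated with a rule-based baseline: a structured record is verbalized through a hand-written template. The record fields and template below are hypothetical, chosen only to make the input/output contract concrete.

```python
# Minimal illustration of data-to-text generation: a structured record
# (a dict of attribute/value pairs) is rendered as a sentence via a
# fixed template. Real systems learn this mapping from data instead.

def verbalize(record: dict) -> str:
    """Render a restaurant record as one descriptive sentence."""
    return (f"{record['name']} is a {record['food']} restaurant "
            f"in the {record['area']} with a {record['rating']} rating.")

record = {"name": "The Eagle", "food": "French",
          "area": "riverside", "rating": "3 out of 5"}
print(verbalize(record))
# → The Eagle is a French restaurant in the riverside with a 3 out of 5 rating.
```

Neural approaches replace the fixed template with a learned model, but the input/output signature — structured record in, fluent text out — is the same.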
(Image credit: Data-to-Text Generation with Content Selection and Planning)
This paper summarises the experimental setup and results of the first shared task on end-to-end (E2E) natural language generation (NLG) in spoken dialogue systems.
#4 best model for Data-to-Text Generation on E2E NLG Challenge
This paper describes the E2E data, a new dataset for training end-to-end, data-driven natural language generation systems in the restaurant domain, which is ten times bigger than existing, frequently used datasets in this area.
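E2E dataset inputs are flat meaning representations (MRs) written as comma-separated `attribute[value]` pairs. A short sketch of parsing that format into a dictionary, using an example MR in the dataset's published style (the specific values here are illustrative):

```python
import re

def parse_mr(mr: str) -> dict:
    """Parse an E2E-style meaning representation, e.g.
    'name[The Eagle], food[French]', into attribute/value pairs."""
    return dict(re.findall(r"(\w+)\[([^\]]*)\]", mr))

mr = "name[The Eagle], eatType[coffee shop], food[French], area[riverside]"
print(parse_mr(mr))
# → {'name': 'The Eagle', 'eatType': 'coffee shop',
#    'food': 'French', 'area': 'riverside'}
```

Each MR in the dataset is paired with one or more human-written reference sentences, which is what makes it suitable for training end-to-end generators.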
Recent neural models have shown significant progress on the problem of generating short descriptive texts conditioned on a small number of database records.
#3 best model for Data-to-Text Generation on RotoWire (Relation Generation)
We follow the step-by-step approach to neural data-to-text generation we proposed in Moryossef et al. (2019), in which the generation process is divided into a text-planning stage followed by a plan-realization stage.
We propose to split the generation process into a symbolic text-planning stage that is faithful to the input, followed by a neural generation stage that focuses only on realization.
#2 best model for Data-to-Text Generation on WebNLG
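The plan-then-realize split described above can be sketched with a toy pipeline: a symbolic planning step orders the input facts, then a separate realization step verbalizes each planned fact. Both stages are trivially rule-based here (the cited work uses a neural realizer), and the triples and ordering heuristic are illustrative assumptions.

```python
# Toy two-stage pipeline: symbolic planning (ordering the input
# triples), then realization (one sentence per planned triple).

def plan(facts: list) -> list:
    """Text planning: fix an order over (subject, relation, object)
    triples; here, simply sort by subject."""
    return sorted(facts, key=lambda f: f[0])

def realize(fact: tuple) -> str:
    """Realization: verbalize a single triple as a sentence."""
    subj, rel, obj = fact
    return f"{subj.replace('_', ' ')} {rel.replace('_', ' ')} {obj}."

facts = [("Alan_Bean", "was_born_in", "Wheeler, Texas"),
         ("Alan_Bean", "served_as", "a test pilot")]
print(" ".join(realize(f) for f in plan(facts)))
```

Keeping the plan symbolic is what lets the pipeline stay faithful to the input: the neural component only has to realize facts that the planner has already selected and ordered.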
Recent advances in data-to-text generation have led to the use of large-scale datasets and neural network models which are trained end-to-end, without explicitly modeling what to say and in what order.
#2 best model for Data-to-Text Generation on RotoWire (Content Selection)
Recent approaches to data-to-text generation have shown great promise thanks to the use of large-scale datasets and the application of neural network architectures which are trained end-to-end.
Learning from class-imbalanced data continues to be a common and challenging problem in supervised learning as standard classification algorithms are designed to handle balanced class distributions.