Transition-Based Deep Input Linearization

EACL 2017  ·  Ratish Puduppully, Yue Zhang, Manish Shrivastava

Traditional methods for deep NLG adopt pipeline approaches comprising stages such as constructing syntactic input, predicting function words, linearizing the syntactic input, and generating the surface forms. Though easier to visualize, pipeline approaches suffer from error propagation; in addition, information available in one module cannot be leveraged by the others. We construct a transition-based model that jointly performs linearization, function word prediction, and morphological generation, which considerably improves accuracy compared with a pipelined baseline system. On a standard deep input linearization shared task, our system achieves the best results reported so far.
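The abstract describes a transition system that builds the output incrementally rather than in pipeline stages. A minimal, hypothetical sketch of the core idea (not the paper's actual model): a state holds the words output so far plus the bag of remaining input words, each transition appends one remaining word, a scoring function ranks transitions (here a toy bigram table stands in for the learned model), and beam search keeps the top-k partial linearizations.

```python
def linearize(words, score_bigram, beam_size=4):
    """Order `words` by beam search over append-one-word transitions.

    Hypothetical illustration only: the real system scores transitions
    with a trained model and also predicts function words and morphology.
    """
    # Each beam item: (total score, output sequence, remaining words).
    beam = [(0.0, [], list(words))]
    for _ in range(len(words)):
        candidates = []
        for total, seq, remaining in beam:
            prev = seq[-1] if seq else "<s>"
            # One candidate transition per remaining word.
            for i, w in enumerate(remaining):
                candidates.append((
                    total + score_bigram(prev, w),
                    seq + [w],
                    remaining[:i] + remaining[i + 1:],
                ))
        # Keep only the highest-scoring partial linearizations.
        candidates.sort(key=lambda c: c[0], reverse=True)
        beam = candidates[:beam_size]
    return max(beam, key=lambda c: c[0])[1]


# Toy scorer: reward bigrams seen in a tiny "training" sentence.
seen = {("<s>", "the"), ("the", "cat"), ("cat", "sat")}
score = lambda a, b: 1.0 if (a, b) in seen else 0.0

print(linearize(["sat", "the", "cat"], score))  # -> ['the', 'cat', 'sat']
```

Joint decoding in this style lets a single search score all decisions together, which is what removes the error propagation of a staged pipeline.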



Results from the Paper

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Data-to-Text Generation | SR11Deep | Transition-based Deep Input Linearization | BLEU | 80.49 | #1 |
