Improving Encoder by Auxiliary Supervision Tasks for Table-to-Text Generation

ACL 2021 · Liang Li, Can Ma, Yinliang Yue, Dayong Hu

Table-to-text generation aims to automatically generate natural-language text that helps people conveniently obtain the salient information in tables. Although neural models for table-to-text generation have achieved remarkable progress, some problems remain overlooked. Previous methods cannot deduce factual results from an entity's (player or team) performance and the relations between entities. To address this, we first build an entity graph from the input tables and introduce a reasoning module that performs reasoning over the graph. Moreover, records are related along different dimensions (e.g., the numeric-size relation and the importance relation), and these relations may contribute to data-to-text generation; however, a vanilla encoder struggles to capture them. Consequently, we propose two auxiliary tasks, Number Ranking (NR) and Importance Ranking (IR), that supervise the encoder to capture these different relations. Experimental results on ROTOWIRE and RW-FG show that our method not only generalizes well but also outperforms previous methods on several metrics: BLEU, Content Selection, and Content Ordering.
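The abstract does not specify how the NR/IR auxiliary tasks are scored, but ranking-style supervision is commonly implemented as a pairwise margin loss: for two records whose ground-truth values are ordered, the encoder-derived score of the larger record should exceed that of the smaller by a margin. The sketch below is a hypothetical illustration of such a loss (the function name, margin value, and plain-Python formulation are assumptions, not the authors' implementation):

```python
def pairwise_ranking_loss(scores, values, margin=1.0):
    """Hypothetical pairwise margin ranking loss for a Number-Ranking-style
    auxiliary task.

    scores: per-record scalars predicted from encoder states.
    values: ground-truth numeric values of the same records.
    For every pair where values[i] > values[j], penalize the model unless
    scores[i] exceeds scores[j] by at least `margin` (hinge loss).
    """
    loss, pairs = 0.0, 0
    for i in range(len(values)):
        for j in range(len(values)):
            if values[i] > values[j]:
                loss += max(0.0, margin - (scores[i] - scores[j]))
                pairs += 1
    return loss / pairs if pairs else 0.0
```

In training, such a term would typically be added to the generation loss with a weighting coefficient, so the encoder is pushed to embed the relative ordering of record values.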



Results from the Paper

Task                      Dataset   Model                          Metric                  Value   Global Rank
Table-to-Text Generation  RotoWire  HierarchicalEncoder + NR + IR  BLEU                    17.96   # 1
Table-to-Text Generation  RotoWire  HierarchicalEncoder + NR + IR  Content Ordering        25.30   # 1
Table-to-Text Generation  RotoWire  HierarchicalEncoder + NR + IR  Content Selection (F1)  55.88   # 1
Data-to-Text Generation   RotoWire  HierarchicalEncoder + NR + IR  BLEU                    17.96   # 1

