Transition-based Neural RST Parsing with Implicit Syntax Features

COLING 2018  ·  Nan Yu, Meishan Zhang, Guohong Fu

Syntax has been a useful source of information for statistical RST discourse parsing. In the neural setting, a common approach integrates syntax through a recursive neural network (Tree-RNN), which requires discrete output trees produced by a supervised syntax parser. In this paper, we propose an implicit syntax feature extraction approach that uses hidden-layer vectors extracted from a neural syntax parser. In addition, we propose a simple transition-based model as the baseline, further enhancing it with a dynamic oracle. Experiments on the standard dataset show that our baseline model with the dynamic oracle is highly competitive. When implicit syntax features are integrated, we obtain further improvements, outperforming the explicit Tree-RNN approach.
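The abstract's core idea can be illustrated with a minimal sketch (not the authors' code): a greedy shift-reduce loop over EDUs, where each EDU carries a dense "implicit syntax" vector rather than a discrete parse tree. All names here (`Node`, `parse`, `score_reduce`) are hypothetical; in the paper the feature vectors come from a neural syntax parser's hidden layers and the reduce decision comes from a trained classifier.

```python
# Hedged sketch of a transition-based discourse parser skeleton.
# EDU feature vectors stand in for implicit syntax features; the
# scorer stands in for the trained transition classifier.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    """A span in the discourse tree; leaves cover a single EDU."""
    left: int
    right: int
    children: tuple = ()
    relation: Optional[str] = None


def parse(edu_vectors: List[list], score_reduce) -> Node:
    """Greedy shift-reduce parse over EDUs.

    edu_vectors: one implicit-syntax feature vector per EDU (in the
        paper these are hidden-layer vectors from a syntax parser;
        here they are just placeholder lists).
    score_reduce: callable(stack) -> bool; True means REDUCE the top
        two stack items into one subtree, else SHIFT the next EDU.
    """
    stack: List[Node] = []
    buffer = [Node(i, i) for i in range(len(edu_vectors))]
    while buffer or len(stack) > 1:
        if len(stack) >= 2 and (not buffer or score_reduce(stack)):
            r, l = stack.pop(), stack.pop()
            # Relation label is fixed here purely for illustration.
            stack.append(Node(l.left, r.right, (l, r), relation="Elaboration"))
        else:
            stack.append(buffer.pop(0))
    return stack[0]


# An always-reduce scorer yields a left-branching tree over 4 EDUs.
tree = parse([[0.0]] * 4, lambda stack: True)
```

A dynamic oracle, as used for the paper's baseline, would additionally supply the least-cost transition from any (possibly erroneous) parser state during training, rather than following a single gold sequence.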


Datasets

RST-DT

Results from the Paper


Task: Discourse Parsing
Dataset: RST-DT
Model: Transition-based Parser with Implicit Syntax Features

Metric                     Metric Value   Global Rank
RST-Parseval (Span)        85.5           # 7
RST-Parseval (Nuclearity)  73.1           # 5
RST-Parseval (Relation)    60.2           # 4
RST-Parseval (Full)        59.9           # 3

Methods


No methods listed for this paper.