Syntax Aware LSTM model for Semantic Role Labeling

WS 2017 · Feng Qian, Lei Sha, Baobao Chang, Lu-chen Liu, Ming Zhang

In the Semantic Role Labeling (SRL) task, the tree-structured dependency relation is rich in syntactic information, but it is not well handled by existing models. In this paper, we propose the Syntax Aware Long Short-Term Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole dependency tree through architecture engineering rather than feature engineering. Experiments demonstrate that on Chinese Proposition Bank (CPB) 1.0, SA-LSTM improves F1 by 2.06% over an ordinary bi-LSTM that uses feature-engineered dependency relation information, and achieves a state-of-the-art F1 of 79.92%. On the English CoNLL 2005 dataset, SA-LSTM brings a 2.1% improvement over the bi-LSTM model and a further slight improvement (0.3%) when added to the state-of-the-art model.
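
The abstract does not spell out the mechanism, but one plausible reading is that each token's LSTM step receives, in addition to the previous hidden state, a gated contribution from the hidden states of earlier tokens it is linked to in the dependency tree. The sketch below only illustrates that idea; it is not the authors' released code, and all names (SyntaxAwareLSTM, heads, syntax_gate) are illustrative assumptions.

```python
# Minimal sketch of a "syntax aware" LSTM layer: besides the usual recurrence,
# each token receives a gated contribution from the hidden state of a
# dependency-linked token that appears earlier in the sentence.
# Hypothetical names; not the paper's implementation.
import torch
import torch.nn as nn


class SyntaxAwareLSTM(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # Gate deciding how much syntactic context to mix into each hidden state.
        self.syntax_gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, heads: list[int]) -> torch.Tensor:
        """x: (seq_len, input_size); heads[t] = dependency head index of token t (-1 for ROOT)."""
        seq_len, _ = x.shape
        h = torch.zeros(self.cell.hidden_size)
        c = torch.zeros(self.cell.hidden_size)
        hs = []
        for t in range(seq_len):
            # Hidden state of the dependency head, if it was already processed.
            syn = torch.zeros_like(h)
            if 0 <= heads[t] < t:
                syn = hs[heads[t]]
            gate = torch.sigmoid(self.syntax_gate(torch.cat([x[t], syn])))
            h, c = self.cell(x[t].unsqueeze(0), (h.unsqueeze(0), c.unsqueeze(0)))
            h, c = h.squeeze(0), c.squeeze(0)
            # Inject gated syntactic information into the hidden state.
            h = h + gate * syn
            hs.append(h)
        return torch.stack(hs)


if __name__ == "__main__":
    model = SyntaxAwareLSTM(input_size=8, hidden_size=16)
    tokens = torch.randn(5, 8)        # 5 tokens, 8-dim embeddings
    heads = [1, -1, 1, 1, 3]          # toy dependency heads (ROOT = -1)
    print(model(tokens, heads).shape) # torch.Size([5, 16])
```

The paper's gating and choice of syntactic neighbours may differ; the point of the sketch is only that the recurrence topology follows each sentence's dependency parse rather than a fixed chain.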


Datasets


Chinese Proposition Bank (CPB) 1.0 · CoNLL 2005

Results from the Paper


Semantic Role Labeling on Chinese Proposition Bank 1.0: 79.92 F1 (state of the art at publication)
Semantic Role Labeling on CoNLL 2005: +2.1% F1 over a bi-LSTM baseline, +0.3% F1 when added to the state-of-the-art model

Methods


LSTM · Bi-LSTM · Syntax Aware LSTM (SA-LSTM)