Recurrent Neural Network Grammars

We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese.
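The abstract describes RNNGs as generative models with explicit phrase structure, built in the paper via a top-down transition system of NT(X), GEN(w), and REDUCE actions. As a hedged illustration of that action vocabulary, the sketch below replays an action sequence into a bracketed tree; the action names follow the paper, but the helper itself is an illustrative reconstruction, not the authors' code (which additionally scores each action with an RNN).

```python
# Illustrative sketch only: decode an RNNG-style action sequence into a
# bracketed parse string. NT/GEN/REDUCE mirror the paper's transition
# system; this helper is hypothetical and does no probabilistic scoring.

def actions_to_tree(actions):
    """Replay NT/GEN/REDUCE actions into a bracketed parse string."""
    stack = []  # each entry is an open constituent: ["(LABEL", child, ...]
    for act in actions:
        if act[0] == "NT":        # open a nonterminal, e.g. ("NT", "S")
            stack.append(["(" + act[1]])
        elif act[0] == "GEN":     # generate a terminal word under the top constituent
            stack[-1].append(act[1])
        elif act == "REDUCE":     # close the topmost open nonterminal
            completed = " ".join(stack.pop()) + ")"
            if stack:
                stack[-1].append(completed)
            else:
                return completed  # root constituent closed
    raise ValueError("action sequence did not close the root constituent")

actions = [
    ("NT", "S"),
    ("NT", "NP"), ("GEN", "the"), ("GEN", "hungry"), ("GEN", "cat"), "REDUCE",
    ("NT", "VP"), ("GEN", "meows"), "REDUCE",
    "REDUCE",
]
print(actions_to_tree(actions))  # (S (NP the hungry cat) (VP meows))
```

In the actual model, each action is chosen by an RNN conditioned on the stack, the generated words, and the action history, so the same sequence defines a joint probability over the sentence and its tree.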

NAACL 2016

Datasets


| Task                 | Dataset       | Model       | Metric   | Value | Global Rank |
|----------------------|---------------|-------------|----------|-------|-------------|
| Constituency Parsing | Penn Treebank | RNN Grammar | F1 score | 93.3  | #25         |
