Constituency Parsing

48 papers with code • 3 benchmarks • 3 datasets

Constituency parsing aims to extract from a sentence a constituency-based parse tree that represents its syntactic structure according to a phrase structure grammar.

Example:

             Sentence (S)
                 |
   +-------------+------------+
   |                          |
 Noun (N)                Verb Phrase (VP)
   |                          |
 John                 +-------+--------+
                      |                |
                    Verb (V)         Noun (N)
                      |                |
                    sees              Bill

Recent approaches convert the parse tree into a sequence following a depth-first traversal, so that sequence-to-sequence models can be applied to it. The linearized version of the above parse tree looks as follows: (S (N) (VP (V) (N))), with the words omitted because they are recoverable from the input sentence.
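
The traversal itself is a few lines of code. Below is a minimal sketch in Python, assuming a nested (label, children) tuple representation for the tree; the linearize helper is illustrative, not taken from any of the papers below:

    def linearize(node, keep_words=True):
        """Serialize a (label, children) parse tree into a bracketed string, depth-first."""
        label, children = node
        if not children:                      # leaf node: a word
            return label
        parts = [linearize(child, keep_words) for child in children]
        if not keep_words:                    # keep sub-brackets, drop the words
            parts = [p for p in parts if p.startswith("(")]
        inner = " ".join(parts)
        return f"({label} {inner})" if inner else f"({label})"

    # The example tree from above, as nested (label, children) tuples.
    tree = ("S", [
        ("N", [("John", [])]),
        ("VP", [
            ("V", [("sees", [])]),
            ("N", [("Bill", [])]),
        ]),
    ])

    print(linearize(tree))                    # (S (N John) (VP (V sees) (N Bill)))
    print(linearize(tree, keep_words=False))  # (S (N) (VP (V) (N)))

Keeping the words gives the standard bracketed notation; dropping them yields the compact form shown above, where the terminals are supplied by the input side of the sequence-to-sequence model.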

Greatest papers with code

Attention Is All You Need

tensorflow/models NeurIPS 2017

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

Abstractive Text Summarization Constituency Parsing +1

Grammar as a Foreign Language

atpaino/deep-text-corrector NeurIPS 2015

Syntactic constituency parsing is a fundamental problem in natural language processing and has been the subject of intensive research and engineering for decades.

Constituency Parsing

Multilingual Constituency Parsing with Self-Attention and Pre-Training

nikitakit/self-attentive-parser ACL 2019

We show that constituency parsing benefits from unsupervised pre-training across a variety of languages and a range of pre-training conditions.

Constituency Parsing Unsupervised Pre-training

Constituency Parsing with a Self-Attentive Encoder

nikitakit/self-attentive-parser ACL 2018

We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead to improvements to a state-of-the-art discriminative constituency parser.

Constituency Parsing
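
Both entries above correspond to the benepar package released from the nikitakit/self-attentive-parser repository. A hedged usage sketch, assuming benepar is pip-installed and that benepar_en3 is still the current English model name (check the repository README for up-to-date names):

    import benepar

    benepar.download("benepar_en3")    # one-time model download
    parser = benepar.Parser("benepar_en3")

    # Pre-tokenized input; parse() returns an nltk.Tree over the sentence.
    tree = parser.parse(["John", "sees", "Bill", "."])
    print(tree)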

YellowFin and the Art of Momentum Tuning

JianGoForIt/YellowFin ICLR 2018

We revisit the momentum SGD algorithm and show that hand-tuning a single learning rate and momentum makes it competitive with Adam.

Constituency Parsing Language Modelling

Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks

theamrzaki/text_summurization_abstractive_methods NeurIPS 2015

Recurrent Neural Networks can be trained to produce sequences of tokens given some input, as exemplified by recent results in machine translation and image captioning.

Constituency Parsing Curriculum Learning +2

Effective Self-Training for Parsing

BLLIP/bllip-parser NAACL 2006

We present a simple, but surprisingly effective, method of self-training a two-phase parser-reranker system using readily available unlabeled data.

Ranked #19 on Constituency Parsing on Penn Treebank (using extra training data)

Constituency Parsing
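
The self-training recipe above is simple enough to sketch. The Python below is an illustration of the loop as described, not the BLLIP implementation; all of the callables passed in (train_parser, train_reranker, rerank) are hypothetical placeholders:

    # One round of parser-reranker self-training, per the description
    # above; not the BLLIP code. All callables are hypothetical, and
    # parser(sentence) is assumed to return an n-best list of trees.

    def self_train(labeled_trees, unlabeled_sentences,
                   train_parser, train_reranker, rerank):
        """Return a retrained first-stage parser plus the unchanged reranker."""
        parser = train_parser(labeled_trees)              # phase one: base parser
        reranker = train_reranker(parser, labeled_trees)  # phase two: reranker

        # Parse the unlabeled data; keep the reranker-best tree for each
        # sentence as an automatically labeled training example.
        auto_labeled = [rerank(reranker, parser(s)) for s in unlabeled_sentences]

        # Retrain only the first-stage parser on gold plus self-labeled trees.
        return train_parser(labeled_trees + auto_labeled), reranker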