
Constituency Parsing

26 papers with code · Natural Language Processing

Constituency parsing aims to extract a constituency-based parse tree from a sentence, representing its syntactic structure according to a phrase structure grammar.

Example:

             Sentence (S)
                 |
   +-------------+------------+
   |                          |
 Noun (N)                Verb Phrase (VP)
   |                          |
 John                 +-------+--------+
                      |                |
                    Verb (V)         Noun (N)
                      |                |
                    sees              Bill

Recent approaches convert the parse tree into a sequence by depth-first traversal so that sequence-to-sequence models can be applied to it. With terminal words omitted, the linearized version of the above parse tree looks as follows: (S (N) (VP (V) (N))).
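To make the linearization concrete, here is a minimal Python sketch (not taken from any of the listed papers; the nested-tuple tree encoding and the choice to omit terminal words are illustrative assumptions):

    # Minimal sketch: linearize a constituency tree by depth-first traversal.
    # The nested-tuple tree encoding (label, children) is an illustrative
    # assumption, not an API from any of the papers listed below.
    def linearize(tree):
        """Emit "(label child child ...)" for internal nodes and "(label)"
        for pre-terminals, walking the tree depth-first."""
        label, children = tree
        if not children:
            return f"({label})"
        return f"({label} {' '.join(linearize(c) for c in children)})"

    # The example tree above, with terminal words omitted:
    tree = ("S", [("N", []), ("VP", [("V", []), ("N", [])])])
    print(linearize(tree))  # -> (S (N) (VP (V) (N)))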

Greatest papers with code

Attention Is All You Need

NeurIPS 2017 tensorflow/models

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

CONSTITUENCY PARSING MACHINE TRANSLATION

Grammar as a Foreign Language

NeurIPS 2015 atpaino/deep-text-corrector

Syntactic constituency parsing is a fundamental problem in natural language processing and has been the subject of intensive research and engineering for decades.

CONSTITUENCY PARSING

Multilingual Constituency Parsing with Self-Attention and Pre-Training

ACL 2019 nikitakit/self-attentive-parser

We show that constituency parsing benefits from unsupervised pre-training across a variety of languages and a range of pre-training conditions.

CONSTITUENCY PARSING

Constituency Parsing with a Self-Attentive Encoder

ACL 2018 nikitakit/self-attentive-parser

We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead to improvements to a state-of-the-art discriminative constituency parser.

CONSTITUENCY PARSING
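The repository above is distributed as the benepar Python package. A minimal usage sketch, assuming the package's NLTK-style interface and its pre-trained English model "benepar_en3" (the model name and download step are assumptions about the package, not claims made in this listing):

    # Hedged sketch of the benepar package from nikitakit/self-attentive-parser.
    # Assumes its NLTK-style API and the "benepar_en3" pre-trained model.
    import benepar

    benepar.download("benepar_en3")          # one-time model download
    parser = benepar.Parser("benepar_en3")
    sent = benepar.InputSentence(words=["John", "sees", "Bill", "."])
    tree = parser.parse(sent)                # returns an nltk.Tree
    print(tree.pformat())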

YellowFin and the Art of Momentum Tuning

ICLR 2018 JianGoForIt/YellowFin

We revisit the momentum SGD algorithm and show that hand-tuning a single learning rate and momentum makes it competitive with Adam.

CONSTITUENCY PARSING LANGUAGE MODELLING

Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks

NeurIPS 2015 theamrzaki/text_summurization_abstractive_methods

Recurrent Neural Networks can be trained to produce sequences of tokens given some input, as exemplified by recent results in machine translation and image captioning.

CONSTITUENCY PARSING IMAGE CAPTIONING SPEECH RECOGNITION

Recurrent Neural Network Grammars

NAACL 2016 clab/rnng

We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure.

CONSTITUENCY PARSING LANGUAGE MODELLING

Unsupervised Latent Tree Induction with Deep Inside-Outside Recursive Auto-Encoders

NAACL 2019 iesl/diora

We introduce the deep inside-outside recursive autoencoder (DIORA), a fully-unsupervised method for discovering syntax that simultaneously learns representations for constituents within the induced tree.

CONSTITUENCY GRAMMAR INDUCTION CONSTITUENCY PARSING
