# Constituency Parsing

21 papers with code · Natural Language Processing

Constituency parsing aims to extract a constituency-based parse tree from a sentence, representing its syntactic structure according to a phrase structure grammar.

Example:

```
              Sentence (S)
                   |
        +----------+----------+
        |                     |
    Noun (N)           Verb Phrase (VP)
        |                     |
      John             +------+------+
                       |             |
                   Verb (V)      Noun (N)
                       |             |
                     sees          Bill
```


Recent approaches convert the parse tree into a sequence following a depth-first traversal in order to be able to apply sequence-to-sequence models to it. The linearized version of the above parse tree looks as follows: (S (N) (VP V N)).
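A minimal sketch of such a depth-first linearization (the tree encoding and the `linearize` helper are illustrative assumptions, not taken from any particular parser). This variant keeps brackets around every node, so the tree above becomes `(S (N) (VP (V) (N)))`, a slightly more verbose form of the linearization shown above.

```python
# Sketch: linearize a constituency tree by depth-first traversal into a
# bracketed string suitable as a sequence-to-sequence target.
# A tree is (label, children), where children is either a list of
# subtrees or a single word string for a pre-terminal node.

def linearize(tree):
    label, children = tree
    if isinstance(children, str):
        # Pre-terminal: drop the word, keep only the syntactic category.
        return f"({label})"
    # Phrase: emit the label, then recurse into each child in order.
    return "(" + " ".join([label] + [linearize(c) for c in children]) + ")"

tree = ("S", [("N", "John"),
              ("VP", [("V", "sees"), ("N", "Bill")])])
print(linearize(tree))  # → (S (N) (VP (V) (N)))
```

Because the output is a flat token sequence, a standard encoder-decoder model can be trained to predict it directly from the input sentence.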

# Attention Is All You Need

The dominant sequence transduction models are based on complex recurrent or convolutional neural networks in an encoder-decoder configuration.

# Grammar as a Foreign Language

Syntactic constituency parsing is a fundamental problem in natural language processing and has been the subject of intensive research and engineering for decades.

# YellowFin and the Art of Momentum Tuning

We revisit the momentum SGD algorithm and show that hand-tuning a single learning rate and momentum makes it competitive with Adam.

# Multilingual Constituency Parsing with Self-Attention and Pre-Training

31 Dec 2018 · nikitakit/self-attentive-parser

We show that constituency parsing benefits from unsupervised pre-training across a variety of languages and a range of pre-training conditions.

# Constituency Parsing with a Self-Attentive Encoder

We demonstrate that replacing an LSTM encoder with a self-attentive architecture can lead to improvements to a state-of-the-art discriminative constituency parser.

# Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks

Recurrent Neural Networks can be trained to produce sequences of tokens given some input, as exemplified by recent results in machine translation and image captioning.

# Recurrent Neural Network Grammars

We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure.

# Span-Based Constituency Parsing with a Structure-Label System and Provably Optimal Dynamic Oracles

Parsing accuracy using efficient greedy transition systems has improved dramatically in recent years thanks to neural networks.
