Constituency Grammar Induction

6 papers with code · Natural Language Processing

Greatest papers with code

Unsupervised Recurrent Neural Network Grammars

NAACL 2019 harvardnlp/urnng

On language modeling, unsupervised RNNGs perform as well as their supervised counterparts on benchmarks in English and Chinese.

#4 best model for Constituency Grammar Induction on PTB (Max F1 (WSJ) metric)

CONSTITUENCY GRAMMAR INDUCTION LANGUAGE MODELLING

Compound Probabilistic Context-Free Grammars for Grammar Induction

ACL 2019 harvardnlp/compound-pcfg

We study a formalization of the grammar induction problem that models sentences as being generated by a compound probabilistic context-free grammar.

CONSTITUENCY GRAMMAR INDUCTION
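The compound generative story in the entry above is easy to miniaturize. Below is a minimal, hypothetical NumPy sketch of the idea: one latent vector z is drawn per sentence, a stand-in linear layer maps z to the rule probabilities of an ordinary PCFG, and a tree is sampled top-down. The symbol inventory, the linear parameterization, and the depth cap are all illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symbol inventory (illustrative, not the paper's grammar).
NT = ["S", "NP", "VP"]                     # nonterminals
T = ["dogs", "cats", "chase", "sleep"]     # terminals
z_dim = 8

# Linear maps from z to rule logits stand in for the paper's neural
# rule parameterization.
W_bin = rng.normal(size=(len(NT), len(NT) ** 2, z_dim))   # A -> B C
W_emit = rng.normal(size=(len(NT), len(T), z_dim))        # A -> w
W_branch = rng.normal(size=(len(NT), 2, z_dim))           # branch vs. emit

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sample_tree(a, z, depth=0):
    """Expand nonterminal index `a` top-down under the sentence latent z."""
    branch = rng.choice(2, p=softmax(W_branch[a] @ z))
    if depth > 4 or branch == 1:                      # emit a terminal
        return T[rng.choice(len(T), p=softmax(W_emit[a] @ z))]
    bc = rng.choice(len(NT) ** 2, p=softmax(W_bin[a] @ z))
    b, c = divmod(bc, len(NT))
    return (NT[a], sample_tree(b, z, depth + 1), sample_tree(c, z, depth + 1))

z = rng.normal(size=z_dim)     # the "compound" step: one z per sentence
print(sample_tree(0, z))       # a sample from p(tree | z)
```

Marginalizing over z (p(tree) = E_z[p(tree | z)]) is what makes the grammar "compound"; with z fixed, the model reduces to an ordinary PCFG.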

Unsupervised Learning of Syntactic Structure with Invertible Neural Projections

EMNLP 2018 jxhe/struct-learning-with-flow

In this work, we propose a novel generative model that jointly learns discrete syntactic structure and continuous word representations in an unsupervised fashion by cascading an invertible neural network with a structured generative prior.

CONSTITUENCY GRAMMAR INDUCTION DEPENDENCY PARSING
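A minimal sketch of the cascading idea, under stated assumptions: an invertible projection maps observed word embeddings back to a latent space where a structured prior scores them, and invertibility yields an exact density via the change-of-variables formula. The single affine coupling layer below is one common invertible-network choice, not necessarily the paper's, and the isotropic Gaussian stands in for the paper's structured (e.g. Markov) prior.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # embedding dimension (illustrative)

# One affine coupling layer: x = [x1, x2 * exp(s(x1)) + t(x1)].
Ws = rng.normal(size=(D // 2, D // 2))
Wt = rng.normal(size=(D // 2, D // 2))

def inverse_and_logdet(x):
    """Map an observed vector x back to the latent space, e = f^{-1}(x),
    returning e and log|det(de/dx)| for the change of variables."""
    x1, x2 = x[: D // 2], x[D // 2 :]
    s, t = np.tanh(Ws @ x1), Wt @ x1
    e2 = (x2 - t) * np.exp(-s)
    return np.concatenate([x1, e2]), -s.sum()

def log_prior(e):
    # Stand-in for the structured generative prior over latent
    # embeddings; here an isotropic Gaussian for simplicity.
    return -0.5 * (e @ e + len(e) * np.log(2 * np.pi))

x = rng.normal(size=D)                # pretend pretrained word embedding
e, logdet = inverse_and_logdet(x)
log_px = log_prior(e) + logdet        # exact log density via invertibility
print(log_px)
```

The key design point the sketch shows: because the projection is invertible, no approximate posterior over the continuous embeddings is needed; the prior's structure (over sequences of latent vectors) carries the syntactic inductive bias.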

Unsupervised Latent Tree Induction with Deep Inside-Outside Recursive Auto-Encoders

NAACL 2019 iesl/diora

We introduce the deep inside-outside recursive autoencoder (DIORA), a fully unsupervised method for discovering syntax that simultaneously learns representations for constituents within the induced tree.

CONSTITUENCY GRAMMAR INDUCTION
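The inside half of DIORA's chart is simple enough to sketch. The toy code below builds span vectors bottom-up, mixing the compositions over all split points with softmax weights derived from a compatibility score. The composition and scoring functions are illustrative stand-ins, and the outside pass and reconstruction training objective are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8                                    # span-vector size (illustrative)
leaves = rng.normal(size=(5, D))         # stand-ins for leaf embeddings
W = rng.normal(size=(D, 2 * D)) * 0.1    # toy composition parameters
u = rng.normal(size=D)                   # toy span-scoring vector

def compose(l, r):
    # Toy stand-in for the learned composition over two child spans.
    return np.tanh(W @ np.concatenate([l, r]))

n = len(leaves)
inside = {(i, i + 1): (leaves[i], 0.0) for i in range(n)}  # span -> (vec, score)

for width in range(2, n + 1):            # bottom-up over span widths
    for i in range(n - width + 1):
        j = i + width
        vecs, scores = [], []
        for k in range(i + 1, j):        # every split point of span (i, j)
            (lv, ls), (rv, rs) = inside[(i, k)], inside[(k, j)]
            v = compose(lv, rv)
            vecs.append(v)
            scores.append(float(u @ v) + ls + rs)
        s = np.array(scores)
        w = np.exp(s - s.max()); w /= w.sum()   # soft weights over splits
        inside[(i, j)] = ((w[:, None] * np.array(vecs)).sum(0), float(w @ s))

print(inside[(0, n)][0])                 # inside vector for the full sentence
```

Because every span keeps a soft mixture over its split points rather than a hard choice, the whole chart stays differentiable, which is what lets the autoencoder be trained end to end without tree supervision.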

Neural Language Modeling by Jointly Learning Syntax and Lexicon

ICLR 2018 nyu-mll/PRPN-Analysis

In this paper, we propose a novel neural language model, called Parsing-Reading-Predict Networks (PRPN), that can simultaneously induce the syntactic structure from unannotated sentences and leverage the inferred structure to learn a better language model.

CONSTITUENCY GRAMMAR INDUCTION LANGUAGE MODELLING
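PRPN's structure induction rests on per-token scalar "syntactic distances" that gate how far back the language model may attend. Below is a hypothetical, simplified rendering of that gating (a smooth product of soft indicators, not the paper's exact formulation): a past token is screened off once any intervening token carries a larger distance, i.e. a stronger constituent boundary.

```python
import numpy as np

rng = np.random.default_rng(0)

# In PRPN the distances come from a learned convolution over recent
# tokens; random values stand in for them here.
d = rng.uniform(0, 1, size=8)

def gates(t, d, tau=10.0):
    """Soft memory gates for reading position t: g[i] is near 1 when no
    token between i and t has a larger syntactic distance than d[t]."""
    g = np.ones(t)
    for i in range(t):
        # sigmoid(tau * (d[t] - d[j])) ~ "token j does not block i from t"
        blockers = 1.0 / (1.0 + np.exp(tau * (d[i + 1 : t] - d[t])))
        g[i] = blockers.prod() if blockers.size else 1.0
    return g

print(gates(5, d))   # weights that would modulate the reading attention
```

The product form is what ties parsing to language modeling: lowering a distance opens attention across that position, so gradients from next-word prediction shape the induced constituent boundaries.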