Constituency Grammar Induction

8 papers with code • 1 benchmark • 1 dataset

Inducing a constituency-based phrase structure grammar.

Greatest papers with code

Unsupervised Recurrent Neural Network Grammars

harvardnlp/urnng NAACL 2019

On language modeling, unsupervised RNNGs perform as well as their supervised counterparts on benchmarks in English and Chinese.

Ranked #5 on Constituency Grammar Induction on PTB (Max F1 (WSJ) metric)

Constituency Grammar Induction • Language Modelling • +1
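The snippet above only states the result; for orientation, an RNNG models a sentence and its parse jointly through a sequence of transition actions (open a nonterminal, generate a word, reduce). Below is a toy, non-neural executor of such an action sequence, just to show how the actions assemble a tree; the real (U)RNNG scores every action with neural networks, and the labels here are illustrative.

```python
# Toy, non-neural executor for RNNG-style transition actions.
# A real (U)RNNG scores each action with neural networks; this only
# shows how the action sequence builds a tree while emitting words.

def run_actions(actions):
    stack = []
    for act in actions:
        if act == "REDUCE":                     # close the open constituent
            children = []
            while not (isinstance(stack[-1], tuple) and stack[-1][0] == "NT"):
                children.append(stack.pop())
            label = stack.pop()[1]
            stack.append((label, tuple(reversed(children))))
        elif act[0] == "NT":                    # open a constituent, e.g. ("NT", "NP")
            stack.append(act)
        else:                                   # ("GEN", word): emit a terminal
            stack.append(act[1])
    return stack[0]

actions = [("NT", "S"), ("NT", "NP"), ("GEN", "the"), ("GEN", "cat"), "REDUCE",
           ("NT", "VP"), ("GEN", "sat"), "REDUCE", "REDUCE"]
print(run_actions(actions))
# ('S', (('NP', ('the', 'cat')), ('VP', ('sat',))))
```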

Compound Probabilistic Context-Free Grammars for Grammar Induction

harvardnlp/compound-pcfg ACL 2019

We study a formalization of the grammar induction problem that models sentences as being generated by a compound probabilistic context-free grammar.

Constituency Grammar Induction • Variational Inference
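As background, a compound PCFG keeps the standard inside algorithm at its core; what changes is that the rule probabilities are produced per sentence from a latent vector. The sketch below shows only the vanilla inside computation for a toy CNF grammar with randomly initialized, hypothetical rule tables, not the paper's neural parameterization or its variational training loop.

```python
import numpy as np

# Toy CNF PCFG (hypothetical sizes).
# binary[a, b, c] = p(A_a -> B_b C_c), emit[a, w] = p(A_a -> word w).
rng = np.random.default_rng(0)
NT, V = 4, 10                                 # nonterminal count, vocabulary size

binary = rng.random((NT, NT, NT))
emit = rng.random((NT, V))
# Normalize so each nonterminal's rules (binary + emission) sum to 1.
Z = binary.reshape(NT, -1).sum(1) + emit.sum(1)
binary /= Z[:, None, None]
emit /= Z[:, None]

def inside_log_likelihood(sent, root=0):
    """Inside algorithm: log p(sentence) under a CNF PCFG, O(n^3 * NT^3)."""
    n = len(sent)
    beta = np.zeros((n, n + 1, NT))           # beta[i, j, a] = p(A_a => w_i..w_{j-1})
    for i, w in enumerate(sent):
        beta[i, i + 1] = emit[:, w]           # width-1 spans: emissions
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            for k in range(i + 1, j):         # split point
                # sum over child nonterminals B, C for every parent A
                beta[i, j] += np.einsum('abc,b,c->a',
                                        binary, beta[i, k], beta[k, j])
    return np.log(beta[0, n, root])

sent = [3, 1, 4, 1, 5]                        # word ids
print(inside_log_likelihood(sent))
```

In the compound PCFG, `binary` and `emit` would be functions of a sentence-level latent vector rather than fixed tables.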

Unsupervised Latent Tree Induction with Deep Inside-Outside Recursive Auto-Encoders

iesl/diora NAACL 2019

We introduce the deep inside-outside recursive autoencoder (DIORA), a fully unsupervised method for discovering syntax that simultaneously learns representations for the constituents within the induced tree.

Constituency Grammar Induction
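To make the idea concrete, here is a minimal sketch of an inside pass in the DIORA spirit: each span's vector is a softmax-weighted mixture over its binary splits, with children combined by a learned compose function. The weights `Wc`, scoring vector `u`, composition form, and vector size are all simplified stand-ins; the paper additionally runs a matching outside pass and trains with a reconstruction objective, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8                                                # span-vector size (hypothetical)
Wc = rng.normal(size=(D, 2 * D)) / np.sqrt(2 * D)    # compose weights
u = rng.normal(size=D)                               # split-scoring vector (simplified)

def compose(left, right):
    return np.tanh(Wc @ np.concatenate([left, right]))

def inside_pass(leaves):
    """DIORA-style inside pass: each span vector is a softmax-weighted
    mixture over its possible (left, right) splits."""
    n = len(leaves)
    vec, score = {}, {}
    for i, x in enumerate(leaves):
        vec[i, i + 1] = x
        score[i, i + 1] = 0.0
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            j = i + width
            cands, s = [], []
            for k in range(i + 1, j):
                h = compose(vec[i, k], vec[k, j])
                s.append(u @ h + score[i, k] + score[k, j])
                cands.append(h)
            s = np.array(s)
            w = np.exp(s - s.max()); w /= w.sum()    # softmax over splits
            vec[i, j] = sum(wi * hi for wi, hi in zip(w, cands))
            score[i, j] = float(w @ s)
    return vec, score

leaves = [rng.normal(size=D) for _ in range(5)]      # stand-in word vectors
vec, score = inside_pass(leaves)
print(score[0, 5])
```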

Unsupervised Learning of Syntactic Structure with Invertible Neural Projections

jxhe/struct-learning-with-flow EMNLP 2018

In this work, we propose a novel generative model that jointly learns discrete syntactic structure and continuous word representations in an unsupervised fashion by cascading an invertible neural network with a structured generative prior.

Constituency Grammar Induction • Unsupervised Dependency Parsing
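A central ingredient is an invertible projection between the observed embeddings and the space where the structured prior lives. The sketch below uses a single additive coupling layer (a NICE-style flow) as a stand-in for the paper's invertible network: it is invertible by construction and volume-preserving, so the change-of-variables term vanishes. Layer sizes and weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8                                             # embedding size (hypothetical)
W1 = rng.normal(size=(D // 2, D // 2)) * 0.1      # weights of the coupling map
W2 = rng.normal(size=(D // 2, D // 2)) * 0.1

def m(h):                                         # small nonlinear map
    return W2 @ np.tanh(W1 @ h)

def forward(x):
    """Additive coupling layer: invertible by construction, log|det J| = 0."""
    x1, x2 = x[:D // 2], x[D // 2:]
    return np.concatenate([x1, x2 + m(x1)])

def inverse(y):
    y1, y2 = y[:D // 2], y[D // 2:]
    return np.concatenate([y1, y2 - m(y1)])

x = rng.normal(size=D)                            # a word embedding
y = forward(x)                                    # projected into the prior's space
print(np.allclose(inverse(y), x))                 # exact inversion -> True
```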

Neural Language Modeling by Jointly Learning Syntax and Lexicon

nyu-mll/PRPN-Analysis ICLR 2018

In this paper, we propose a novel neural language model, the Parsing-Reading-Predict Networks (PRPN), which can simultaneously induce syntactic structure from unannotated sentences and leverage the inferred structure to learn a better language model.

Constituency Grammar Induction • Language Modelling
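PRPN's parsing component can be summarized as predicting a "syntactic distance" for each adjacent word pair and splitting the sentence recursively at the largest gap. The sketch below implements just that decoding step on hypothetical distances; in the actual model the distances come from a learned network and also gate the reading and predicting components.

```python
def distances_to_tree(words, dist):
    """Greedily split at the largest syntactic distance; dist[i] scores
    the gap between words[i] and words[i + 1]."""
    if len(words) == 1:
        return words[0]
    k = max(range(len(dist)), key=dist.__getitem__)   # split at the max gap
    left = distances_to_tree(words[:k + 1], dist[:k])
    right = distances_to_tree(words[k + 1:], dist[k + 1:])
    return (left, right)

words = ["the", "cat", "sat", "down"]
dist = [0.2, 0.9, 0.4]            # hypothetical distances from a trained PRPN
print(distances_to_tree(words, dist))
# (('the', 'cat'), ('sat', 'down'))
```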

Visually Grounded Compound PCFGs

zhaoyanpeng/vpcfg EMNLP 2020

In this work, we study visually grounded grammar induction and learn a constituency parser from both unlabeled text and its visual groundings.

Constituency Grammar Induction • Language Modelling
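The visual grounding enters through a matching loss that pulls text-side constituent representations toward embeddings of their paired images and pushes mismatched pairs apart. Below is a generic max-margin image-text matching loss of the kind such models build on; the batch size, embedding size, and margin are illustrative, and the paper aggregates this signal over spans weighted by the parser's marginals, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)
B, D = 4, 16                                   # batch size, embedding size

def l2norm(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

txt = l2norm(rng.normal(size=(B, D)))          # text-side span/sentence vectors
img = l2norm(rng.normal(size=(B, D)))          # paired image vectors

def hinge_matching_loss(txt, img, margin=0.2):
    """Max-margin image-text matching: each matched pair should outscore
    every mismatched pair in the batch by at least `margin`."""
    sim = txt @ img.T                          # cosine similarities
    pos = np.diag(sim)[:, None]                # matched-pair scores
    cost_img = np.maximum(0, margin + sim - pos)     # negatives: wrong images
    cost_txt = np.maximum(0, margin + sim - pos.T)   # negatives: wrong texts
    mask = 1 - np.eye(B)                       # exclude the matched pairs
    return ((cost_img + cost_txt) * mask).sum() / B

print(hinge_matching_loss(txt, img))
```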

PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols

sustcsonglin/TN-PCFG 28 Apr 2021

In this work, we present a new parameterization of PCFGs based on tensor decomposition, which has at most quadratic computational complexity in the number of symbols and therefore allows us to use a much larger number of symbols.

Constituency Grammar Induction
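The key trick is never materializing the NT x NT x NT binary-rule tensor: a tensor (CP) decomposition T[a,b,c] = sum_r U[a,r] V[b,r] W[c,r] lets the inside update be computed in factored form. The sketch below contrasts the naive update with the factored one for hypothetical sizes; the actual parameterization, normalization, and training objective are more involved.

```python
import numpy as np

rng = np.random.default_rng(0)
NT, R = 1000, 32                     # many symbols, small tensor rank (hypothetical)

# CP/tensor decomposition of the binary-rule tensor:
#   T[a, b, c] = sum_r U[a, r] * V[b, r] * W[c, r]
U = rng.random((NT, R))
V = rng.random((NT, R))
W = rng.random((NT, R))

beta_left = rng.random(NT)           # inside scores of the two child spans
beta_right = rng.random(NT)

# A naive inside update would touch all NT^3 rule weights:
#   new[a] = sum_{b,c} T[a, b, c] * beta_left[b] * beta_right[c]
# The factored form never builds T and costs O(NT * R) per split:
new = U @ ((V.T @ beta_left) * (W.T @ beta_right))
print(new.shape)                     # (1000,)
```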