Constituency Grammar Induction

15 papers with code • 1 benchmark • 1 dataset

Inducing a constituency-based phrase structure grammar from unannotated text.

Simple Hardware-Efficient PCFGs with Independent Left and Right Productions

sustcsonglin/TN-PCFG 23 Oct 2023

Scaling dense PCFGs to thousands of nonterminals via a low-rank parameterization of the rule probability tensor has been shown to be beneficial for unsupervised parsing.

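The factorization named in the title can be sketched concretely. Assuming a grammar with N nonterminals, modeling left and right productions independently, p(A → B C) = p_left(B | A) · p_right(C | A), reduces the binary-rule table from O(N³) to O(N²) parameters. A minimal NumPy illustration of this idea (toy sizes; not the paper's actual code):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
N = 8  # number of nonterminals (toy size)

# Independent left and right production distributions:
# p(A -> B C) = p_left(B | A) * p_right(C | A)
p_left = softmax(rng.normal(size=(N, N)))   # rows: parent A, cols: left child B
p_right = softmax(rng.normal(size=(N, N)))  # rows: parent A, cols: right child C

# The full rule tensor is an outer product per parent: O(N^2) parameters
# instead of O(N^3) for an unrestricted PCFG.
rules = np.einsum('ab,ac->abc', p_left, p_right)

# Each parent's rule distribution still sums to 1 over all (B, C) pairs.
assert np.allclose(rules.sum(axis=(1, 2)), 1.0)
```

Because the two child distributions factor, the inside algorithm can sum over left and right children separately, which is what makes this parameterization hardware-efficient at large N.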

Ensemble Distillation for Unsupervised Constituency Parsing

manga-uofa/ed4ucp 3 Oct 2023

We investigate the unsupervised constituency parsing task, which organizes words and phrases of a sentence into a hierarchical structure without using linguistically annotated data.


Augmenting Transformers with Recursively Composed Multi-grained Representations

ant-research/structuredlm_rtdt 28 Sep 2023

Notably, the hierarchical structures induced by ReCAT exhibit strong consistency with human-annotated syntactic trees, indicating good interpretability brought by the CIO layers.


Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs

sustcsonglin/TN-PCFG NAACL 2022

Recent research has found it beneficial to use large state spaces for HMMs and PCFGs.

01 May 2022

Co-training an Unsupervised Constituency Parser with Weak Supervision

Nickil21/weakly-supervised-parsing Findings (ACL) 2022

We introduce a method for unsupervised parsing that relies on bootstrapping classifiers to identify if a node dominates a specific span in a sentence.

05 Oct 2021

Dependency Induction Through the Lens of Visual Perception

ruisi-su/concrete_dep CoNLL (EMNLP) 2021

Our experiments find that concreteness is a strong indicator for learning dependency grammars, improving the direct attachment score (DAS) by over 50% compared to state-of-the-art models trained on pure text.

20 Sep 2021

Neural Bi-Lexicalized PCFG Induction

sustcsonglin/TN-PCFG ACL 2021

Neural lexicalized PCFGs (L-PCFGs) have been shown effective in grammar induction.

31 May 2021

PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols

sustcsonglin/TN-PCFG NAACL 2021

In this work, we present a new parameterization form of PCFGs based on tensor decomposition, which has at most quadratic computational complexity in the symbol number and therefore allows us to use a much larger number of symbols.

28 Apr 2021
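The tensor-decomposition idea behind this quadratic complexity can be illustrated with a small sketch (my own illustration under a CP-decomposition assumption, not the paper's code): approximating the rule tensor as T[A, B, C] ≈ Σ_r U[A, r] · V[B, r] · W[C, r] lets the inside algorithm combine child spans in rank space, so a merge step costs O(N·R) instead of O(N³):

```python
import numpy as np

rng = np.random.default_rng(0)
N, R = 16, 4  # number of symbols and decomposition rank (toy sizes)

# CP-decomposed rule scores: T[A, B, C] = sum_r U[A,r] * V[B,r] * W[C,r]
U = rng.random((N, R))
V = rng.random((N, R))
W = rng.random((N, R))

# Inside scores of the left and right child spans
beta_left = rng.random(N)
beta_right = rng.random(N)

# Naive combination: materialize the full rule tensor -- O(N^3)
T = np.einsum('ar,br,cr->abc', U, V, W)
naive = np.einsum('abc,b,c->a', T, beta_left, beta_right)

# Rank-space combination: project each child into rank space first -- O(N*R)
left_r = beta_left @ V    # shape (R,)
right_r = beta_right @ W  # shape (R,)
fast = U @ (left_r * right_r)

# Both routes give the same inside scores for the parent symbols.
assert np.allclose(naive, fast)
```

The rank-space route never builds the N×N×N tensor, which is what allows scaling to thousands of symbols.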

Visually Grounded Compound PCFGs

zhaoyanpeng/vpcfg EMNLP 2020

In this work, we study visually grounded grammar induction and learn a constituency parser from both unlabeled text and its visual groundings.

25 Sep 2020

Compound Probabilistic Context-Free Grammars for Grammar Induction

harvardnlp/compound-pcfg ACL 2019

We study a formalization of the grammar induction problem that models sentences as being generated by a compound probabilistic context-free grammar.

24 Jun 2019
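The compound idea is that each sentence draws its own latent variable z, and the grammar's rule probabilities are a function of z, so the corpus is generated by a continuous mixture of PCFGs. A minimal sketch of this conditioning (the affine map here is hypothetical; the paper uses a neural network):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
N, Z = 6, 10  # nonterminals and latent dimension (toy sizes)

# Per-sentence latent variable: each sentence gets its own grammar instance.
z = rng.normal(size=Z)

# Rule scores as a (hypothetical) affine function of z; softmax over all
# (B, C) pairs yields a valid rule distribution for each parent A.
Wz = rng.normal(size=(N, N * N, Z))
b = rng.normal(size=(N, N * N))
rule_probs = softmax(Wz @ z + b, axis=-1).reshape(N, N, N)

# For each parent A, p(A -> B C | z) sums to 1 over all (B, C).
assert np.allclose(rule_probs.sum(axis=(1, 2)), 1.0)
```

Marginalizing over z breaks the context-freeness assumption across a sentence's rules while keeping exact inside computation tractable conditioned on each sampled z.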