Constituency Grammar Induction
15 papers with code • 1 benchmark • 1 dataset
Inducing a constituency-based phrase structure grammar.
Latest papers
Simple Hardware-Efficient PCFGs with Independent Left and Right Productions
Scaling dense PCFGs to thousands of nonterminals via a low-rank parameterization of the rule probability tensor has been shown to be beneficial for unsupervised parsing.
Ensemble Distillation for Unsupervised Constituency Parsing
We investigate the unsupervised constituency parsing task, which organizes words and phrases of a sentence into a hierarchical structure without using linguistically annotated data.
Augmenting Transformers with Recursively Composed Multi-grained Representations
More interestingly, the hierarchical structures induced by ReCAT exhibit strong consistency with human-annotated syntactic trees, indicating good interpretability brought by the CIO layers.
Dynamic Programming in Rank Space: Scaling Structured Inference with Low-Rank HMMs and PCFGs
Recent research found it beneficial to use large state spaces for HMMs and PCFGs.
Co-training an Unsupervised Constituency Parser with Weak Supervision
We introduce a method for unsupervised parsing that relies on bootstrapping classifiers to identify if a node dominates a specific span in a sentence.
Dependency Induction Through the Lens of Visual Perception
Our experiments find that concreteness is a strong indicator for learning dependency grammars, improving the direct attachment score (DAS) by over 50% as compared to state-of-the-art models trained on pure text.
Neural Bi-Lexicalized PCFG Induction
Neural lexicalized PCFGs (L-PCFGs) have been shown effective in grammar induction.
PCFGs Can Do Better: Inducing Probabilistic Context-Free Grammars with Many Symbols
In this work, we present a new parameterization form of PCFGs based on tensor decomposition, which has at most quadratic computational complexity in the symbol number and therefore allows us to use a much larger number of symbols.
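The efficiency gain from such a tensor decomposition can be illustrated with a small sketch. The idea, under assumed CP-style factors `U`, `V`, `W` (hypothetical names; the paper's exact parameterization and normalization differ), is that the binary-rule tensor T[A,B,C] = Σ_r U[A,r]·V[B,r]·W[C,r] never needs to be materialized: an inside-algorithm update over children can be pushed into the rank space, replacing a cubic-in-symbols contraction with matrix-vector products.

```python
import numpy as np

m, r = 8, 4  # hypothetical sizes: m nonterminal symbols, decomposition rank r
rng = np.random.default_rng(0)
U = rng.random((m, r))  # head (parent) factor
V = rng.random((m, r))  # left-child factor
W = rng.random((m, r))  # right-child factor

# Unnormalized inside scores of the two child spans (stand-ins for real charts)
beta_left = rng.random(m)
beta_right = rng.random(m)

# Direct contraction with the full rule tensor: O(m^3) time and memory
T = np.einsum('ar,br,cr->abc', U, V, W)
direct = np.einsum('abc,b,c->a', T, beta_left, beta_right)

# Low-rank route: project each child into rank space, multiply pointwise,
# then map back through the head factor -- only O(m * r) per span pair
low_rank = U @ ((V.T @ beta_left) * (W.T @ beta_right))

assert np.allclose(direct, low_rank)
```

The assertion confirms the two contractions agree, which is why the rank-space formulation lets the symbol count grow without the usual cubic blow-up in the inside pass.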
Visually Grounded Compound PCFGs
In this work, we study visually grounded grammar induction and learn a constituency parser from both unlabeled text and its visual groundings.
Compound Probabilistic Context-Free Grammars for Grammar Induction
We study a formalization of the grammar induction problem that models sentences as being generated by a compound probabilistic context-free grammar.