Dependency Grammar Induction
3 papers with code • 2 benchmarks • 0 datasets
Also known as "unsupervised dependency parsing"
Latest papers with no code
Enhancing Unsupervised Generative Dependency Parser with Contextual Information
In this paper, we propose a novel probabilistic model called discriminative neural dependency model with valence (D-NDMV) that generates a sentence and its parse from a continuous latent representation, which encodes global contextual information of the generated sentence.
Dependency Grammar Induction with a Neural Variational Transition-based Parser
Transition-based models enable faster inference with $O(n)$ time complexity, but their performance still lags behind.
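The $O(n)$ claim follows because each token is shifted onto the stack exactly once and reduced exactly once. A minimal sketch of an arc-standard transition system illustrates this; the decision policy here is a hypothetical placeholder (always RIGHT-ARC when two items are on the stack), whereas the paper's parser would score SHIFT / LEFT-ARC / RIGHT-ARC with a neural variational model.

```python
def parse(words):
    """Return head indices (0 = root) via arc-standard transitions.

    Placeholder policy only: each token is shifted once and reduced
    once, so the number of transitions is linear in sentence length.
    """
    buffer = list(range(1, len(words) + 1))  # 1-based token ids
    stack, heads = [], {}
    while buffer or len(stack) > 1:
        if len(stack) < 2:            # SHIFT: move next token onto stack
            stack.append(buffer.pop(0))
        else:                         # placeholder policy: RIGHT-ARC
            dep = stack.pop()         # top of stack becomes a dependent
            heads[dep] = stack[-1]    # ... of the item beneath it
    heads[stack.pop()] = 0            # last remaining item attaches to root
    return [heads[i] for i in range(1, len(words) + 1)]

print(parse(["the", "dog", "barks"]))  # → [0, 1, 1]
```

A 3-word sentence takes exactly 3 shifts and 3 reductions, so inference cost grows linearly rather than with the cubic cost of chart-based inference.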
Combining Generative and Discriminative Approaches to Unsupervised Dependency Parsing via Dual Decomposition
Unsupervised dependency parsing aims to learn a dependency parser from unannotated sentences.
Dependency Grammar Induction with Neural Lexicalization and Big Training Data
We study the impact of big models (in terms of the degree of lexicalization) and big data (in terms of the training corpus size) on dependency grammar induction.
Matroids Hitting Sets and Unsupervised Dependency Grammar Induction
This paper formulates a novel problem on graphs: find the minimal subset of edges in a fully connected graph such that the resulting graph contains all spanning trees for a set of specified sub-graphs.
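One way to make the objective concrete, under the (assumed) reading that the chosen edge subset must span each specified vertex group, is a brute-force search over edge subsets of the complete graph. This is an illustrative toy, not the paper's matroid-based algorithm; `minimal_edge_set` and `connects` are hypothetical names.

```python
from itertools import combinations

def connects(edges, group):
    """True if `edges`, restricted to `group`, connect all its vertices."""
    group = set(group)
    parent = {v: v for v in group}          # union-find forest
    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v
    for u, v in edges:
        if u in group and v in group:
            parent[find(u)] = find(v)       # union the two components
    return len({find(v) for v in group}) == 1

def minimal_edge_set(n, groups):
    """Brute force: smallest edge subset of K_n spanning every group."""
    all_edges = list(combinations(range(n), 2))
    for k in range(len(all_edges) + 1):     # try subsets by increasing size
        for subset in combinations(all_edges, k):
            if all(connects(subset, g) for g in groups):
                return set(subset)

print(minimal_edge_set(4, [(0, 1, 2), (2, 3)]))
```

For vertex groups {0, 1, 2} and {2, 3} on K4, the minimum is 3 edges: two to span the first group plus the edge (2, 3). The exponential search is exactly what the paper's combinatorial machinery is meant to avoid.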