Unsupervised Dependency Parsing

4 papers with code • 1 benchmark • 1 dataset

Unsupervised dependency parsing is the task of inferring the dependency parse of sentences without any labeled training data.

Description from NLP Progress
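
As a point of reference for the task setup, here is a minimal sketch of how induced parses are typically represented (one head index per word) and scored against a gold treebank with unlabeled attachment score (UAS). The function and variable names are illustrative and not taken from any of the papers listed below.

```python
# Minimal sketch: a dependency parse as a list of head indices (0 = root),
# and unlabeled attachment score (UAS), the usual metric for comparing an
# induced parse against a gold treebank parse. Names are illustrative.

def unlabeled_attachment_score(predicted_heads, gold_heads):
    """Fraction of tokens whose predicted head matches the gold head."""
    assert len(predicted_heads) == len(gold_heads)
    correct = sum(p == g for p, g in zip(predicted_heads, gold_heads))
    return correct / len(gold_heads)

# "the dog barked": head of "the" is "dog" (2), head of "dog" is "barked" (3),
# head of "barked" is the artificial ROOT (0). Heads are 1-indexed.
gold = [2, 3, 0]
predicted = [3, 3, 0]   # an induced parse that attaches "the" to "barked"
print(unlabeled_attachment_score(predicted, gold))  # 0.666...
```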

Most implemented papers

StructFormer: Joint Unsupervised Induction of Dependency and Constituency Structure from Masked Language Modeling

google-research/google-research ACL 2021

There are two major classes of natural language grammar -- the dependency grammar, which models one-to-one correspondences between words, and the constituency grammar, which models the assembly of one or several corresponding words.
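
The small illustration below shows the same three-word sentence under the two formalisms: a dependency parse assigns each word a single head, while a constituency parse nests words into labeled phrases. The data structures and phrase labels are generic examples, not taken from the StructFormer paper.

```python
# Illustration only: one sentence, two grammar formalisms.

sentence = ["the", "dog", "barked"]

# Dependency view: one head per word (0 = artificial root), i.e. a
# one-to-one correspondence pairing each word with its head.
dependency_heads = [2, 3, 0]   # "the" -> "dog", "dog" -> "barked", "barked" -> ROOT

# Constituency view: words assembled into nested, labeled constituents.
constituency_tree = ("S", ("NP", "the", "dog"), ("VP", "barked"))
```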

CRF Autoencoder for Unsupervised Dependency Parsing

caijiong/CRFAE-Dep-Parser EMNLP 2017

The encoder part of our model is discriminative and globally normalized, which allows us to use rich features as well as universal linguistic priors.
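
To make "globally normalized" concrete: the distribution over parses is normalized by a partition function summed over all candidate trees rather than over individual local decisions. The sketch below computes such a log-partition function for a generic arc-factored model over non-projective trees via the directed matrix-tree theorem; it is an assumption-laden illustration of the general idea, not the paper's actual encoder, features, or tree space.

```python
import numpy as np

def log_partition(arc_scores, root_scores):
    """Log of the sum, over all non-projective dependency trees, of
    exp(total arc score), via the directed matrix-tree theorem.

    arc_scores[h, m] scores word h as the head of word m (0-indexed over
    the n words); root_scores[m] scores word m as the root's child.
    Illustrative only: names and parameterization are not from the paper.
    """
    n = arc_scores.shape[0]
    w = np.exp(arc_scores)          # arc weights between words
    r = np.exp(root_scores)         # weights of arcs from the artificial root
    L = -w                          # off-diagonal: negated arc weights
    idx = np.arange(n)
    # Diagonal: total weight of arcs entering each word (root arc included).
    L[idx, idx] = r + w.sum(axis=0) - np.diag(w)
    sign, logdet = np.linalg.slogdet(L)
    return logdet                   # sign is +1 for positive arc weights

rng = np.random.default_rng(0)
print(log_partition(rng.normal(size=(3, 3)), rng.normal(size=3)))
```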

Unsupervised Learning of Syntactic Structure with Invertible Neural Projections

jxhe/struct-learning-with-flow EMNLP 2018

In this work, we propose a novel generative model that jointly learns discrete syntactic structure and continuous word representations in an unsupervised fashion by cascading an invertible neural network with a structured generative prior.
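
The change-of-variables computation behind an invertible projection can be sketched as follows: an observed embedding is mapped into a latent space where the prior is defined, and the log-likelihood adds the log-determinant of the Jacobian. The linear projection, the single-Gaussian prior, and all names below are simplifying assumptions made for illustration, not the paper's invertible neural network or its structured prior.

```python
import numpy as np

# Change-of-variables sketch: an invertible projection z = f(x) maps a
# pretrained embedding x into a latent space, and
#   log p(x) = log p_prior(f(x)) + log |det df/dx|.
# The linear f and the Gaussian prior are illustrative assumptions.

d = 4
rng = np.random.default_rng(0)
W = rng.normal(size=(d, d))             # parameters of an invertible linear map
x = rng.normal(size=d)                  # stand-in for a pretrained word embedding
mu = np.zeros(d)                        # mean of a Gaussian prior component

z = W @ x                               # the invertible projection
log_det_jac = np.linalg.slogdet(W)[1]   # log |det W|
log_prior = -0.5 * np.sum((z - mu) ** 2) - 0.5 * d * np.log(2.0 * np.pi)
print(log_prior + log_det_jac)          # log-density of x under this sketch
```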