Second-Order Unsupervised Neural Dependency Parsing

COLING 2020  ·  Songlin Yang, Yong Jiang, Wenjuan Han, Kewei Tu

Most unsupervised dependency parsers are based on first-order probabilistic generative models that only consider local parent-child information. Inspired by second-order supervised dependency parsing, we propose a second-order extension of unsupervised neural dependency models that incorporates grandparent-child or sibling information. We also propose a novel design for the neural parameterization and optimization methods of the dependency models. In second-order models, the number of grammar rules grows cubically with the vocabulary size, making it difficult to train lexicalized models that may contain thousands of words. To circumvent this problem while still benefiting from both second-order parsing and lexicalization, we use the agreement-based learning framework to jointly train a second-order unlexicalized model and a first-order lexicalized model. Experiments on multiple datasets show the effectiveness of our second-order models compared with recent state-of-the-art methods. Our joint model achieves a 10% improvement over the previous state-of-the-art parser on the full WSJ test set.
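The cubic blow-up mentioned in the abstract can be seen with a quick back-of-the-envelope calculation. The sketch below is illustrative only (the function names and the vocabulary size are assumptions, not from the paper): a first-order lexicalized model needs a parameter per (head, child) word pair, while a second-order lexicalized model would need one per (grandparent, head, child) triple.

```python
# Hypothetical illustration of rule-count growth for lexicalized
# dependency models over a vocabulary of size v (not from the paper).

def first_order_rules(v):
    # one rule per (head-word, child-word) pair: O(v^2)
    return v * v

def second_order_rules(v):
    # one rule per (grandparent, head, child) word triple: O(v^3)
    return v ** 3

v = 10_000  # e.g. a vocabulary of ten thousand word types
print(first_order_rules(v))   # 100,000,000 pairs
print(second_order_rules(v))  # 1,000,000,000,000 triples
```

At this scale a second-order lexicalized rule table is impractical to estimate, which is why the paper instead pairs a second-order *unlexicalized* model with a first-order lexicalized one under agreement-based learning.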


Results from the Paper

| Task | Dataset | Model | Metric | Value | Global Rank |
|---|---|---|---|---|---|
| Dependency Grammar Induction | WSJ | Joint training: sibling-NDMV + L-NDMV * | UAS | 67.5 | #1 |
| Dependency Grammar Induction | WSJ10 | Joint training: sibling-NDMV + L-NDMV * | UAS | 79.9 | #1 |

