Neural Bi-Lexicalized PCFG Induction

ACL 2021  ·  Songlin Yang, Yanpeng Zhao, Kewei Tu

Neural lexicalized PCFGs (L-PCFGs) have been shown to be effective in grammar induction. However, to reduce computational complexity, they make a strong independence assumption on the generation of the child word, and bilexical dependencies are thus ignored. In this paper, we propose an approach to parameterizing L-PCFGs without making implausible independence assumptions. Our approach directly models bilexical dependencies while reducing both the learning and representation complexities of L-PCFGs. Experimental results on the English WSJ dataset confirm the effectiveness of our approach in improving both running speed and unsupervised parsing performance.
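The core complexity issue the abstract refers to can be illustrated with a minimal sketch: a naive bilexical model needs a dense V x V table of p(child_word | head_word), while a factorized parameterization scores word pairs through low-dimensional embeddings. This is a hypothetical illustration of the general idea, not the paper's actual parameterization; the names `E_head`, `E_child`, and `child_word_dist` are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 6, 4  # toy vocabulary size and embedding dimension (hypothetical)

# Instead of storing a dense V x V table of bilexical probabilities,
# score (head, child) word pairs through shared low-dimensional
# embeddings, so storage scales as O(V * d) rather than O(V^2).
E_head = rng.normal(size=(V, d))
E_child = rng.normal(size=(V, d))

def child_word_dist(head):
    """p(child_word | head_word) via an inner-product softmax."""
    scores = E_child @ E_head[head]  # (V,) pairwise compatibility scores
    scores -= scores.max()           # subtract max for numerical stability
    p = np.exp(scores)
    return p / p.sum()               # normalize to a distribution

dist = child_word_dist(head=2)
print(dist.shape)  # (6,)
```

With a realistic vocabulary (V in the tens of thousands) the savings from avoiding the quadratic table are what makes directly modeling bilexical dependencies tractable.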

Task: Constituency Grammar Induction
Dataset: Penn Treebank (WSJ)
Model: NBL-PCFG
Metric: Mean F1 (WSJ) = 60.4 (global rank #6)
