no code implementations • TACL 2019 • Xiang Lisa Li, Dingquan Wang, Jason Eisner
When the tree's yield is rendered as a written sentence, a string rewriting mechanism transduces the underlying marks into "surface" marks, which are part of the observed (surface) string but should not be regarded as part of the tree.
1 code implementation • EMNLP 2018 • Dingquan Wang, Jason Eisner
To approximately parse an unfamiliar language, it helps to have a treebank of a similar language.
no code implementations • TACL 2018 • Dingquan Wang, Jason Eisner
We show experimentally across multiple languages: (1) Features computed from the unparsed corpus improve parsing accuracy.
no code implementations • IJCNLP 2017 • Dingquan Wang, Nanyun Peng, Kevin Duh
We show how to adapt bilingual word embeddings (BWEs) to bootstrap a cross-lingual named-entity recognition (NER) system in a language with no labeled data.
no code implementations • TACL 2017 • Dingquan Wang, Jason Eisner
We show how to predict the basic word-order facts of a novel language given only a corpus of part-of-speech (POS) sequences.
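To illustrate the kind of signal a corpus of POS sequences carries, here is a hedged toy sketch (not the paper's model, which learns far richer typological properties): estimate one word-order fact, adjective–noun order, by counting which of the two tags more often precedes the other in adjacent positions. The function name and corpus are invented for illustration.

```python
from collections import Counter

def dominant_order(pos_corpus, tag_a="ADJ", tag_b="NOUN"):
    """Toy heuristic: count adjacent tag_a/tag_b bigrams in a corpus of
    POS sequences and report which order dominates. Not the TACL 2017 model."""
    counts = Counter()
    for sent in pos_corpus:
        for left, right in zip(sent, sent[1:]):
            if {left, right} == {tag_a, tag_b}:
                counts[(left, right)] += 1
    ab = counts[(tag_a, tag_b)]
    ba = counts[(tag_b, tag_a)]
    return f"{tag_a}-{tag_b}" if ab >= ba else f"{tag_b}-{tag_a}"

# Invented mini-corpus of POS-tagged sentences (no words needed).
corpus = [
    ["DET", "ADJ", "NOUN", "VERB"],
    ["ADJ", "NOUN", "VERB", "NOUN"],
    ["DET", "NOUN", "VERB"],
]
print(dominant_order(corpus))  # ADJ-NOUN for this toy corpus
```

The point of the sketch is only that word-order facts leave direct statistical traces in POS bigrams; the paper predicts such facts for a novel language from its POS corpus alone.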
1 code implementation • TACL 2016 • Dingquan Wang, Jason Eisner
We release Galactic Dependencies 1.0, a large set of synthetic languages not found on Earth, but annotated in Universal Dependencies format.
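Since the release is annotated in Universal Dependencies format, its trees are stored as CoNLL-U files. A minimal sketch of reading one sentence block of that format (the sample sentence is invented, not taken from the dataset):

```python
def parse_conllu(text):
    """Parse a CoNLL-U sentence block into (id, form, upos, head, deprel)
    tuples. Minimal sketch: skips comment lines, multiword-token ranges,
    and empty nodes."""
    rows = []
    for line in text.strip().splitlines():
        if line.startswith("#"):
            continue  # sentence-level comment, e.g. "# text = ..."
        cols = line.split("\t")
        if "-" in cols[0] or "." in cols[0]:
            continue  # multiword token range (1-2) or empty node (1.1)
        # CoNLL-U columns: ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC
        rows.append((int(cols[0]), cols[1], cols[3], int(cols[6]), cols[7]))
    return rows

# Invented example sentence in CoNLL-U layout (tab-separated columns).
sample = "\n".join([
    "# text = Dogs bark .",
    "\t".join(["1", "Dogs", "dog", "NOUN", "_", "_", "2", "nsubj", "_", "_"]),
    "\t".join(["2", "bark", "bark", "VERB", "_", "_", "0", "root", "_", "_"]),
    "\t".join(["3", ".", ".", "PUNCT", "_", "_", "2", "punct", "_", "_"]),
])
print(parse_conllu(sample))
# [(1, 'Dogs', 'NOUN', 2, 'nsubj'), (2, 'bark', 'VERB', 0, 'root'),
#  (3, '.', 'PUNCT', 2, 'punct')]
```

A head index of 0 marks the root of the dependency tree; every other token points at its governor by ID, which is all a parser needs to reconstruct the tree.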