Search Results for author: Dingquan Wang

Found 6 papers, 2 papers with code

A Generative Model for Punctuation in Dependency Trees

no code implementations TACL 2019 Xiang Lisa Li, Dingquan Wang, Jason Eisner

When the tree's yield is rendered as a written sentence, a string rewriting mechanism transduces the underlying marks into "surface" marks, which are part of the observed (surface) string but should not be regarded as part of the tree.
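The string-rewriting step described above can be illustrated with a toy example. The rule below is an illustrative assumption, not the paper's actual transducer: in American English style, a comma that follows a closing quote in the underlying string surfaces inside the quote.

```python
import re

# Toy surface-rewrite rule (illustrative assumption): move a comma
# that follows a closing quote in the underlying string to inside
# the quote in the surface string.
def to_surface(underlying: str) -> str:
    return re.sub(r'" ?,', ',"', underlying)

print(to_surface('He said "stop" , then left .'))
# He said "stop," then left .
```

Here the underlying comma attaches to the tree, while the surface string shows it transposed inside the quotation marks.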

Punctuation Restoration · Sentence

Surface Statistics of an Unknown Language Indicate How to Parse It

no code implementations TACL 2018 Dingquan Wang, Jason Eisner

We show experimentally across multiple languages: (1) Features computed from the unparsed corpus improve parsing accuracy.

Dependency Parsing · POS

A Multi-task Learning Approach to Adapting Bilingual Word Embeddings for Cross-lingual Named Entity Recognition

no code implementations IJCNLP 2017 Dingquan Wang, Nanyun Peng, Kevin Duh

We show how to adapt bilingual word embeddings (BWEs) to bootstrap a cross-lingual named-entity recognition (NER) system in a language with no labeled data.

Cross-Lingual Transfer · Multi-Task Learning · +4

Fine-Grained Prediction of Syntactic Typology: Discovering Latent Structure with Supervised Learning

no code implementations TACL 2017 Dingquan Wang, Jason Eisner

We show how to predict the basic word-order facts of a novel language given only a corpus of part-of-speech (POS) sequences.

POS

The Galactic Dependencies Treebanks: Getting More Data by Synthesizing New Languages

1 code implementation TACL 2016 Dingquan Wang, Jason Eisner

We release Galactic Dependencies 1.0: a large set of synthetic languages not found on Earth, but annotated in Universal Dependencies format.
