no code implementations • WS 2019 • Joseph Lee, Ziang Xie, Cindy Wang, Max Drach, Dan Jurafsky, Andrew Ng
We introduce a simple method for text style transfer that frames style transfer as denoising: we synthesize a noisy corpus and treat the source style as a noisy version of the target style.
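The denoising framing can be sketched as follows: take sentences in the target style and corrupt them with a synthetic noising function, so a model can be trained to map noisy inputs back to clean target-style text. This is an illustrative sketch, not the paper's exact noising procedure; `drop_prob` and `shuffle_window` are hypothetical hyperparameters.

```python
import random

def noise_sentence(tokens, drop_prob=0.1, shuffle_window=2, rng=None):
    """Synthesize a 'noisy' version of a target-style sentence by
    randomly dropping tokens and locally shuffling word order.
    (Illustrative corruption scheme; not the paper's exact recipe.)"""
    rng = rng or random.Random(0)
    # Randomly drop tokens, keeping at least one.
    kept = [t for t in tokens if rng.random() > drop_prob] or tokens[:1]
    # Local shuffle: each token moves at most shuffle_window positions.
    keys = [i + rng.uniform(0, shuffle_window) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda p: p[0])]

clean = "the quick brown fox jumps over the lazy dog".split()
noisy = noise_sentence(clean)  # a corrupted copy of the clean sentence
```

Pairs of `(noisy, clean)` sentences produced this way form the synthetic parallel corpus for training the denoising model.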
no code implementations • NAACL 2018 • Ziang Xie, Guillaume Genthial, Stanley Xie, Andrew Ng, Dan Jurafsky
Translation-based methods for grammar correction that directly map noisy, ungrammatical sentences to their clean counterparts are able to correct a broad range of errors; however, such techniques are bottlenecked by the need for a large parallel corpus of noisy and clean sentence pairs.
no code implementations • 27 Nov 2017 • Ziang Xie
Deep learning methods have recently achieved great empirical success on machine translation, dialogue response generation, summarization, and other text generation tasks.
no code implementations • 7 Mar 2017 • Ziang Xie, Sida I. Wang, Jiwei Li, Daniel Lévy, Aiming Nie, Dan Jurafsky, Andrew Y. Ng
Data noising is an effective technique for regularizing neural network models.
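One common form of data noising replaces each input token, with some small probability, by a word sampled from the unigram distribution. The sketch below illustrates this scheme in that spirit; the function name and the setting `gamma=0.1` are illustrative assumptions, not the paper's exact configuration.

```python
import random

def unigram_noise(tokens, vocab, unigram_probs, gamma=0.1, rng=None):
    """With probability gamma, replace each token with a word drawn
    from the unigram distribution; otherwise keep it unchanged.
    (Illustrative noising scheme; gamma is an assumed setting.)"""
    rng = rng or random.Random(0)
    return [rng.choices(vocab, weights=unigram_probs)[0]
            if rng.random() < gamma else t
            for t in tokens]

vocab = ["the", "cat", "sat"]
probs = [0.5, 0.3, 0.2]
noised = unigram_noise("the cat sat".split(), vocab, probs)
```

Applied on the fly during training, this acts as a regularizer analogous to smoothing in count-based language models.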
3 code implementations • 31 Mar 2016 • Ziang Xie, Anand Avati, Naveen Arivazhagan, Dan Jurafsky, Andrew Y. Ng
Motivated by these issues, we present a neural network-based approach to language correction.
1 code implementation • 30 Jun 2014 • Andrew L. Maas, Peng Qi, Ziang Xie, Awni Y. Hannun, Christopher T. Lengerich, Daniel Jurafsky, Andrew Y. Ng
We compare standard DNNs to convolutional networks, and present the first experiments using locally-connected, untied neural networks for acoustic modeling.
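The difference between a convolutional and a locally-connected (untied) layer can be shown in a minimal 1-D sketch: a convolution slides one shared filter over the input, while a locally-connected layer gives each output position its own filter. This is a simplified illustration, not the paper's acoustic-model architecture.

```python
import numpy as np

def conv1d(x, w):
    """Standard 1-D convolution (valid padding): one shared filter w."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

def locally_connected1d(x, W):
    """Locally-connected layer: same sliding-window structure, but
    each output position i uses its own (untied) filter W[i]."""
    k = W.shape[1]
    return np.array([x[i:i + k] @ W[i] for i in range(len(x) - k + 1)])

x = np.arange(6, dtype=float)
w = np.ones(3)
W = np.tile(w, (4, 1))  # untied filters that happen to be identical
# When every per-position filter equals w, the two layers coincide.
```

Untying the weights multiplies the parameter count by the number of output positions, trading the translation invariance of convolution for position-specific features.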
Ranked #11 on Speech Recognition on swb_hub_500 (metric: WER, fullSWBCH)