no code implementations • ACL 2019 • Bingzhen Wei, Mingxuan Wang, Hao Zhou, Junyang Lin, Jun Xie, Xu Sun
Non-autoregressive translation (NAT) models have achieved impressive inference speedups.
no code implementations • 2 Sep 2018 • Bingzhen Wei, Junyang Lin
We propose a novel model for Neural Machine Translation (NMT).
no code implementations • 10 May 2018 • Bingzhen Wei, Xuancheng Ren, Xu Sun, Yi Zhang, Xiaoyan Cai, Qi Su
In particular, the proposed approach improves semantic consistency by 4% in terms of human evaluation.
1 code implementation • LREC 2018 • Xuancheng Ren, Xu Sun, Ji Wen, Bingzhen Wei, Weidong Zhan, Zhiyuan Zhang
Web 2.0 has brought with it a wealth of user-produced data revealing people's thoughts, experiences, and knowledge, which is a rich source for many tasks such as information extraction and knowledge base construction.
3 code implementations • 17 Nov 2017 • Xu Sun, Xuancheng Ren, Shuming Ma, Bingzhen Wei, Wei Li, Jingjing Xu, Houfeng Wang, Yi Zhang
Based on the sparsified gradients, we further simplify the model by eliminating the rows or columns that are seldom updated, which reduces the computational cost of both training and decoding and can potentially accelerate decoding in real-world applications.
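The simplification step lends itself to a short sketch. Below is a minimal illustration (not the authors' released code; the pruning threshold and all names are assumptions): it counts how often each hidden unit's gradient survives top-k sparsification, then drops the rows or columns that are seldom updated.

```python
# Hypothetical sketch of meProp-style model simplification, not the paper's code.
import numpy as np

def topk_mask(grad, k):
    """Return a 0/1 mask keeping only the k largest-magnitude entries."""
    idx = np.argsort(np.abs(grad))[..., -k:]
    mask = np.zeros_like(grad)
    np.put_along_axis(mask, idx, 1.0, axis=-1)
    return mask

hidden, k, steps = 256, 16, 1000
update_counts = np.zeros(hidden)
for _ in range(steps):
    grad = np.random.randn(hidden)      # stand-in for a real backprop gradient
    update_counts += topk_mask(grad, k)

# Eliminate units whose parameters were rarely touched by the sparse updates
# (the 1% threshold is an assumption for illustration).
keep = update_counts > 0.01 * steps
print(f"kept {keep.sum()} of {hidden} hidden units")
```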
no code implementations • 4 Nov 2017 • Jingjing Xu, Xu Sun, Sujian Li, Xiaoyan Cai, Bingzhen Wei
In this paper, we propose a deep stacking framework to improve the performance on word segmentation tasks with insufficient data by integrating datasets from diverse domains.
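As a rough illustration of the stacking idea (an assumption-laden sketch, not the paper's exact architecture), a target-domain segmenter can consume the label predictions of a model trained on a different domain as additional input features:

```python
# Hypothetical deep-stacking sketch in PyTorch; names and sizes are illustrative.
import torch
import torch.nn as nn

class StackedSegmenter(nn.Module):
    def __init__(self, vocab, tags, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.tag_emb = nn.Embedding(tags, dim)  # embeds the base model's predictions
        self.rnn = nn.LSTM(2 * dim, dim, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * dim, tags)

    def forward(self, chars, base_preds):
        x = torch.cat([self.emb(chars), self.tag_emb(base_preds)], dim=-1)
        h, _ = self.rnn(x)
        return self.out(h)

model = StackedSegmenter(vocab=5000, tags=4)      # e.g. B/M/E/S tagging
chars = torch.randint(0, 5000, (2, 10))
base_preds = torch.randint(0, 4, (2, 10))          # from a source-domain segmenter
logits = model(chars, base_preds)                  # (2, 10, 4)
```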
1 code implementation • ICLR 2018 • Xu Sun, Bingzhen Wei, Xuancheng Ren, Shuming Ma
We propose a method, called Label Embedding Network, which can learn label representation (label embedding) during the training process of deep networks.
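A minimal sketch of the core idea as I read it (not the released implementation): each label receives a trainable embedding, and the network scores its output representation against the label embeddings, so label similarity is learned jointly with the model.

```python
# Hypothetical label-embedding head; feat_dim and emb_dim are illustrative choices.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelEmbeddingHead(nn.Module):
    def __init__(self, feat_dim, num_labels, emb_dim=128):
        super().__init__()
        self.proj = nn.Linear(feat_dim, emb_dim)
        self.label_emb = nn.Embedding(num_labels, emb_dim)  # learned label representations

    def forward(self, features):
        h = self.proj(features)               # (B, emb_dim)
        return h @ self.label_emb.weight.t()  # (B, num_labels) logits

head = LabelEmbeddingHead(feat_dim=512, num_labels=10)
logits = head(torch.randn(4, 512))
loss = F.cross_entropy(logits, torch.tensor([1, 3, 0, 7]))
```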
no code implementations • 18 Sep 2017 • Bingzhen Wei, Xu Sun, Xuancheng Ren, Jingjing Xu
As traditional neural networks consume a significant amount of computing resources during back propagation, Sun et al. (2017) propose a simple yet effective technique to alleviate this problem.
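The technique keeps only the k largest-magnitude components of the output gradient during back propagation. A minimal PyTorch sketch under that reading (a custom autograd function for a plain linear layer; not the authors' code):

```python
# Hypothetical meProp-style sparsified backprop for y = x @ W.T.
import torch

class TopKLinear(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, W, k):
        ctx.save_for_backward(x, W)
        ctx.k = k
        return x @ W.t()

    @staticmethod
    def backward(ctx, grad_out):
        x, W = ctx.saved_tensors
        # Zero all but the top-k entries of the output gradient per example.
        _, idx = grad_out.abs().topk(ctx.k, dim=-1)
        sparse = torch.zeros_like(grad_out).scatter(-1, idx, grad_out.gather(-1, idx))
        return sparse @ W, sparse.t() @ x, None

x = torch.randn(8, 32, requires_grad=True)
W = torch.randn(16, 32, requires_grad=True)
TopKLinear.apply(x, W, 4).sum().backward()   # only 4 of 16 output grads flow back
```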