Search Results for author: Bingzhen Wei

Found 8 papers, 3 papers with code

Building an Ellipsis-aware Chinese Dependency Treebank for Web Text

1 code implementation • LREC 2018 • Xuancheng Ren, Xu Sun, Ji Wen, Bingzhen Wei, Weidong Zhan, Zhiyuan Zhang

Web 2.0 has brought with it an abundance of user-produced data revealing people's thoughts, experiences, and knowledge, which is a great source for many tasks, such as information extraction and knowledge base construction.

Dependency Parsing · Sentence

Training Simplification and Model Simplification for Deep Learning: A Minimal Effort Back Propagation Method

3 code implementations • 17 Nov 2017 • Xu Sun, Xuancheng Ren, Shuming Ma, Bingzhen Wei, Wei Li, Jingjing Xu, Houfeng Wang, Yi Zhang

Based on the sparsified gradients, we further simplify the model by eliminating the rows or columns that are seldom updated, which will reduce the computational cost both in the training and decoding, and potentially accelerate decoding in real-world applications.
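The sparsification step behind this line of work (meProp) keeps only the top-k largest-magnitude entries of a gradient and zeroes out the rest, so most rows or columns of a weight matrix receive no update. A minimal sketch of that selection, assuming a plain NumPy gradient vector (the function name and sample values are illustrative, not the authors' code):

```python
import numpy as np

def meprop_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient; zero the rest."""
    sparse = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[-k:]   # indices of the top-k magnitudes
    sparse[idx] = grad[idx]
    return sparse

grad = np.array([0.05, -0.9, 0.02, 0.7, -0.1])
print(meprop_sparsify(grad, 2))  # only -0.9 and 0.7 survive
```

Under this scheme, rows or columns whose entries are rarely selected can be pruned entirely, which is the model-simplification step the abstract describes.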

Deep Stacking Networks for Low-Resource Chinese Word Segmentation with Transfer Learning

no code implementations • 4 Nov 2017 • Jingjing Xu, Xu Sun, Sujian Li, Xiaoyan Cai, Bingzhen Wei

In this paper, we propose a deep stacking framework to improve the performance on word segmentation tasks with insufficient data by integrating datasets from diverse domains.
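The stacking idea can be sketched as feeding a source-domain model's output distribution into a target-domain model as extra input features. A minimal illustration under assumed shapes (`stack_features` and the dimensions are hypothetical, not the paper's architecture):

```python
import numpy as np

def stack_features(raw_feats, lower_model_probs):
    """Concatenate raw token features with a lower model's label distribution,
    forming the input to the next model in the stack."""
    return np.concatenate([raw_feats, lower_model_probs], axis=-1)

x = np.ones((4, 8))                 # 4 tokens, 8 raw features each
p = np.full((4, 3), 1 / 3)          # a lower model's 3-way label distribution
print(stack_features(x, p).shape)   # (4, 11)
```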

Chinese Word Segmentation · Transfer Learning

Label Embedding Network: Learning Label Representation for Soft Training of Deep Networks

1 code implementation ICLR 2018 Xu Sun, Bingzhen Wei, Xuancheng Ren, Shuming Ma

We propose a method, called Label Embedding Network, which can learn label representation (label embedding) during the training process of deep networks.
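One way such learned label embeddings can soften training targets is to turn the similarity between the true label's embedding and all other labels' embeddings into a distribution. A rough sketch of that step, assuming a NumPy embedding matrix and a temperature parameter (both the function and the exact formulation are illustrative; the paper's training objective may differ):

```python
import numpy as np

def soft_targets(label_emb, y, tau=2.0):
    """Soft label distribution from label embeddings.
    label_emb: (num_classes, dim) embedding matrix; y: true class index."""
    sims = label_emb @ label_emb[y]              # similarity to the true label
    logits = sims / tau
    exp = np.exp(logits - np.max(logits))        # numerically stable softmax
    return exp / exp.sum()
```

Training against these soft distributions, instead of one-hot targets, lets the network express that some labels are more confusable than others.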

Minimal Effort Back Propagation for Convolutional Neural Networks

no code implementations • 18 Sep 2017 • Bingzhen Wei, Xu Sun, Xuancheng Ren, Jingjing Xu

As traditional neural networks consume a significant amount of computing resources during back propagation, Sun et al. (2017) propose a simple yet effective technique to alleviate this problem.
