EraseReLU: A Simple Way to Ease the Training of Deep Convolution Neural Networks

22 Sep 2017 · Xuanyi Dong, Guoliang Kang, Kun Zhan, Yi Yang

For most state-of-the-art architectures, the Rectified Linear Unit (ReLU) has become a standard component accompanying each layer. Although ReLU can ease network training to an extent, its blocking of negative values may suppress the propagation of useful information and make very deep Convolutional Neural Networks (CNNs) difficult to optimize...
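Since the abstract is truncated here, the sketch below illustrates the general idea rather than the paper's exact recipe: it assumes EraseReLU means removing the final ReLU of a residual-style module so that negative activations can flow into the next module. The `BasicBlock` class, the `erase_last_relu` flag, and the layer layout are hypothetical, not the authors' architecture.

```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    """A simple residual block. When erase_last_relu is True, the ReLU
    after the residual addition is dropped, so negative values are no
    longer blocked between modules (the intuition behind EraseReLU)."""

    def __init__(self, channels: int, erase_last_relu: bool = False):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        self.erase_last_relu = erase_last_relu

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + x  # residual connection
        if not self.erase_last_relu:
            out = self.relu(out)  # the ReLU that EraseReLU would remove
        return out

# Hypothetical usage: compare the two variants on a dummy input.
x = torch.randn(1, 16, 32, 32)
standard = BasicBlock(16, erase_last_relu=False)
erased = BasicBlock(16, erase_last_relu=True)
print(standard(x).min().item() >= 0)  # True: the final ReLU clamps negatives
print(erased(x).min().item())         # may be negative: information passes through
```

The design choice to make erasure a constructor flag is only for side-by-side comparison; in practice one would simply omit the last activation from the module definition.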


Evaluation results from the paper


| Task | Dataset | Model | Metric name | Metric value | Global rank |
|------|---------|-------|-------------|--------------|-------------|
| Image Classification | SVHN | EraseReLU | Percentage error | 1.54 | #2 |