Deep Residual Networks with Exponential Linear Unit

14 Apr 2016  ·  Anish Shah, Eashan Kadam, Hena Shah, Sameer Shinde, Sandip Shingade

Very deep convolutional neural networks introduced problems such as vanishing gradients and degradation. Recent successful approaches to these problems are Residual and Highway Networks, which introduce skip connections that allow information (from the input or from features learned in earlier layers) to flow directly into deeper layers. These very deep models have led to a considerable decrease in test error on benchmarks such as ImageNet and COCO. In this paper, we propose using the exponential linear unit (ELU) in place of the combination of ReLU and Batch Normalization in Residual Networks. We show that this not only speeds up learning in Residual Networks but also improves accuracy as depth increases. It improves the test error on almost all datasets, such as CIFAR-10 and CIFAR-100.
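To make the proposed substitution concrete, below is a minimal PyTorch sketch of a residual block where a single ELU (defined as f(x) = x for x > 0 and α(eˣ − 1) otherwise) replaces the usual ReLU + Batch Normalization pair. The block layout, channel count, and class name here are illustrative assumptions for exposition; the paper's exact block design may differ.

```python
import torch
import torch.nn as nn


class ELUResidualBlock(nn.Module):
    """Sketch of a basic residual block using ELU instead of ReLU + BatchNorm.

    Layout (conv -> ELU -> conv, then identity shortcut) is an assumption
    for illustration, not necessarily the paper's exact architecture.
    """

    def __init__(self, channels: int, alpha: float = 1.0):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        # ELU(x) = x for x > 0, alpha * (exp(x) - 1) otherwise
        self.elu = nn.ELU(alpha=alpha)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.elu(self.conv1(x))  # ELU replaces the ReLU + BN pair
        out = self.conv2(out)
        return out + x                 # identity skip connection


if __name__ == "__main__":
    block = ELUResidualBlock(channels=16)
    y = block(torch.randn(1, 16, 32, 32))
    print(y.shape)  # torch.Size([1, 16, 32, 32])
```

Removing Batch Normalization also drops its per-batch statistics computation, which is one plausible source of the training speedup the abstract reports.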


Results from the Paper


Task                  Dataset    Model       Metric              Value  Global Rank
Image Classification  CIFAR-10   ResNet+ELU  Percentage correct  94.4   #140
Image Classification  CIFAR-100  ResNet+ELU  Percentage correct  73.5   #153
