Stochastic Pooling for Regularization of Deep Convolutional Neural Networks

16 Jan 2013  ·  Matthew D. Zeiler, Rob Fergus

We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.
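To make the sampling step concrete, here is a minimal NumPy sketch of training-time stochastic pooling for a single 2D feature map with non-overlapping square regions: each output value is an activation drawn from its region with probability proportional to that activation. The function name `stochastic_pool`, its arguments, and the all-zero-region fallback are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def stochastic_pool(feature_map, pool_size=2, rng=None):
    """Training-time stochastic pooling over non-overlapping regions.

    Within each pool_size x pool_size region, one activation is sampled
    with probability proportional to its (non-negative) value. Assumes the
    feature map height and width are divisible by pool_size.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = feature_map.shape
    out = np.empty((h // pool_size, w // pool_size), dtype=feature_map.dtype)
    for i in range(0, h, pool_size):
        for j in range(0, w, pool_size):
            region = feature_map[i:i + pool_size, j:j + pool_size].ravel()
            total = region.sum()
            if total <= 0:
                # All activations zero (e.g. after ReLU): output zero.
                out[i // pool_size, j // pool_size] = 0.0
                continue
            # Multinomial distribution given by the activities in the region.
            probs = region / total
            idx = rng.choice(region.size, p=probs)
            out[i // pool_size, j // pool_size] = region[idx]
    return out
```

At test time the paper replaces sampling with a probability-weighted average of the activations in each region; that variant is omitted from this sketch.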


Results from the Paper


Task                 | Dataset   | Model              | Metric Name        | Metric Value | Global Rank
Image Classification | CIFAR-10  | Stochastic Pooling | Percentage correct | 84.9         | #201
Image Classification | CIFAR-10  | Stochastic Pooling | Top-1 Accuracy     | 84.9         | #33
Image Classification | CIFAR-100 | Stochastic Pooling | Percentage correct | 57.5         | #191
Image Classification | SVHN      | Stochastic Pooling | Percentage error   | 2.8          | #39

Methods


No methods listed for this paper.