RotationOut as a Regularization Method for Neural Network

In this paper, we propose RotationOut, a novel regularization method for neural networks. Unlike Dropout, which handles each neuron/channel independently, RotationOut treats its input layer as a single vector and introduces regularization by randomly rotating that vector...
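The snippet above does not specify how the paper samples its rotations, so the following is only a minimal sketch of the idea of rotation-as-noise: draw a Haar-random rotation matrix (via QR decomposition of a Gaussian matrix, a standard construction) and apply it to a feature vector during training only, analogous to how Dropout is disabled at inference. The function names `random_rotation` and `rotation_out` are illustrative, not from the paper.

```python
import numpy as np

def random_rotation(dim, rng):
    """Sample a Haar-random rotation matrix via QR of a Gaussian matrix."""
    a = rng.standard_normal((dim, dim))
    q, r = np.linalg.qr(a)
    # Scale each column by the sign of R's diagonal so Q is Haar-distributed.
    q *= np.sign(np.diag(r))
    # Ensure a proper rotation (determinant +1, not a reflection).
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]
    return q

def rotation_out(x, rng, training=True):
    """Apply a fresh random rotation to the feature vector during training only."""
    if not training:
        return x
    return random_rotation(x.shape[-1], rng) @ x

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
y = rotation_out(x, rng)
# A rotation preserves the Euclidean norm, so the perturbation is
# structured noise rather than independent per-neuron masking.
assert np.isclose(np.linalg.norm(x), np.linalg.norm(y))
```

Unlike a Dropout mask, which zeroes coordinates independently, this rotation mixes all coordinates together while preserving the vector's norm, which is the contrast with Dropout that the abstract draws.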
