An ETF view of Dropout regularization

14 Oct 2018 · Dor Bank, Raja Giryes

Dropout is a popular regularization technique in deep learning, yet the reason for its success is still not fully understood. This paper provides a new interpretation of dropout from a frame-theory perspective. By drawing a connection to recent developments in analog channel coding, we suggest that for a certain family of autoencoders with a linear encoder, optimizing the encoder with dropout regularization leads to an equiangular tight frame (ETF). Since this optimization is non-convex, we add another regularization that promotes such structures by minimizing the cross-correlation between filters in the network. We demonstrate the applicability of this regularization to convolutional and fully connected layers in both feed-forward and recurrent networks. All these results suggest that there is indeed a relationship between dropout and the ETF structure of the regularized linear operations.
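To make the cross-correlation regularization concrete, here is a minimal NumPy sketch of one way such a penalty could look: it normalizes the encoder filters, computes their Gram matrix, and penalizes deviations of the off-diagonal correlations from the Welch bound, which an equiangular tight frame attains with equality. The function name `etf_coherence_penalty` and its exact form are illustrative assumptions, not the loss used in the paper.

```python
import numpy as np

def etf_coherence_penalty(W, eps=1e-12):
    """Hypothetical ETF-promoting regularizer (not the paper's exact loss).

    W is a (d, n) matrix whose n columns are the encoder filters, with n >= d.
    Penalizes the deviation of the filters' pairwise cross-correlations from
    the Welch bound, which an equiangular tight frame attains for every pair.
    """
    d, n = W.shape
    # Normalize each filter to unit norm so Gram entries are correlations.
    Wn = W / (np.linalg.norm(W, axis=0, keepdims=True) + eps)
    G = Wn.T @ Wn                                   # cross-correlation (Gram) matrix
    welch = np.sqrt((n - d) / (d * (n - 1)))        # Welch bound on coherence
    off_diag = G[~np.eye(n, dtype=bool)]            # drop the unit diagonal
    return np.mean((np.abs(off_diag) - welch) ** 2)

# Usage: add the penalty, scaled by a weight, to the encoder's training loss.
W = np.random.randn(16, 32)
print(etf_coherence_penalty(W))
```

In practice one would compute this penalty on the encoder's weight matrix at each training step and add it to the task loss; the paper's own regularizer may weight or formulate the cross-correlation term differently.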
