Normalization

Sparse Switchable Normalization

Introduced by Shao et al. in SSN: Learning Sparse Switchable Normalization via SparsestMax

Sparse Switchable Normalization (SSN) is a variant of Switchable Normalization in which the importance ratios are constrained to be sparse. Unlike $\ell_1$ and $\ell_0$ constraints, which impose difficulties in optimization, the constrained optimization problem is turned into a feed-forward computation through SparsestMax, a sparse version of softmax.

Source: SSN: Learning Sparse Switchable Normalization via SparsestMax
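Below is a minimal illustrative sketch of the idea: statistics from Instance, Layer and Batch Normalization are blended with importance ratios that are driven toward sparsity by projecting the ratio logits onto the probability simplex. The sketch uses plain sparsemax (Martins & Astudillo, 2016) as the sparsifying projection rather than the paper's SparsestMax, which additionally anneals a circular constraint so the ratios become fully one-hot; the class name `SparseSwitchableNorm2d` and its parameters are assumptions for this example, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


def sparsemax(logits):
    """Sparsemax: Euclidean projection of a 1-D logit vector onto the
    probability simplex. Unlike softmax it can return exact zeros,
    which is what makes the importance ratios sparse."""
    z, _ = torch.sort(logits, descending=True)
    k = torch.arange(1, z.numel() + 1, dtype=z.dtype, device=z.device)
    cumsum = z.cumsum(0)
    # Size of the support: largest k with 1 + k * z_(k) > sum_{j<=k} z_(j)
    k_z = ((1 + k * z) > cumsum).sum()
    tau = (cumsum[k_z - 1] - 1) / k_z
    return torch.clamp(logits - tau, min=0)


class SparseSwitchableNorm2d(nn.Module):
    """Illustrative sparse-switchable normalization for NCHW inputs.

    Means and variances from IN, LN and BN are combined with two sets of
    learned importance ratios (one for means, one for variances), each
    made sparse via the sparsemax projection above."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(1, num_features, 1, 1))
        self.bias = nn.Parameter(torch.zeros(1, num_features, 1, 1))
        self.mean_logits = nn.Parameter(torch.zeros(3))  # ratios over (IN, LN, BN) means
        self.var_logits = nn.Parameter(torch.zeros(3))   # ratios over (IN, LN, BN) variances
        self.eps = eps

    def forward(self, x):
        # Per-sample, per-channel (IN); per-sample (LN); per-channel over the batch (BN)
        mean_in = x.mean((2, 3), keepdim=True)
        var_in = x.var((2, 3), keepdim=True, unbiased=False)
        mean_ln = x.mean((1, 2, 3), keepdim=True)
        var_ln = x.var((1, 2, 3), keepdim=True, unbiased=False)
        mean_bn = x.mean((0, 2, 3), keepdim=True)
        var_bn = x.var((0, 2, 3), keepdim=True, unbiased=False)

        p_mean = sparsemax(self.mean_logits)
        p_var = sparsemax(self.var_logits)

        mean = p_mean[0] * mean_in + p_mean[1] * mean_ln + p_mean[2] * mean_bn
        var = p_var[0] * var_in + p_var[1] * var_ln + p_var[2] * var_bn

        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return x_hat * self.weight + self.bias
```

As a quick usage check, `SparseSwitchableNorm2d(64)(torch.randn(8, 64, 32, 32))` returns a tensor of the same shape; with zero-initialized logits the ratios start uniform and can collapse onto a subset of normalizers as training sharpens the logits.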
