How Important is Weight Symmetry in Backpropagation?

17 Oct 2015 · Qianli Liao, Joel Z. Leibo, Tomaso Poggio

Gradient backpropagation (BP) requires symmetric feedforward and feedback connections -- the same weights must be used for the forward and backward passes. This "weight transport problem" (Grossberg 1987) is thought to be one of the main reasons to doubt BP's biological plausibility. Using 15 different classification datasets, we systematically investigate to what extent BP really depends on weight symmetry. In a study that turned out to be surprisingly similar in spirit to Lillicrap et al.'s demonstration (Lillicrap et al. 2014) but orthogonal in its results, our experiments indicate that: (1) the magnitudes of feedback weights do not matter to performance; (2) the signs of feedback weights do matter -- the more sign-concordant the feedforward weights and their corresponding feedback connections are, the better; (3) with feedback weights having random magnitudes and 100% concordant signs, we were able to achieve the same or even better performance than SGD; and (4) some normalizations/stabilizations are indispensable for such asymmetric BP to work, namely Batch Normalization (BN) (Ioffe and Szegedy 2015) and/or a "Batch Manhattan" (BM) update rule.
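To make the two mechanisms concrete, here is a minimal NumPy sketch of a two-layer classifier trained with sign-concordant feedback weights and a sign-only ("Batch Manhattan") update. The layer sizes, initialization scale, learning rate, and function names are illustrative assumptions, not values from the paper, and the paper studies several BM variants; this shows only the simplest (sign of the batch gradient).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes and learning rate, chosen for illustration only.
n_in, n_hid, n_out = 784, 256, 10
W1 = rng.normal(0.0, 0.05, (n_hid, n_in))
W2 = rng.normal(0.0, 0.05, (n_out, n_hid))

# Sign-symmetric feedback weights: random magnitudes, signs copied from the
# feedforward weights (the "100% concordant signs" condition).
B2 = np.sign(W2) * np.abs(rng.normal(0.0, 0.05, W2.shape))

def relu(x):
    return np.maximum(x, 0.0)

def train_step(x, y_onehot, lr=0.01):
    """One update on a batch; x is (n_in, batch), y_onehot is (n_out, batch)."""
    global W1, W2
    # Forward pass: ReLU hidden layer, softmax output.
    h = relu(W1 @ x)
    logits = W2 @ h
    p = np.exp(logits - logits.max(axis=0, keepdims=True))
    p /= p.sum(axis=0, keepdims=True)

    # Backward pass: the error signal is carried by B2 rather than W2.T,
    # so forward and backward weights agree only in sign.
    d_logits = (p - y_onehot) / x.shape[1]   # softmax + cross-entropy gradient
    d_h = (B2.T @ d_logits) * (h > 0.0)      # ReLU gate

    g_W2 = d_logits @ h.T
    g_W1 = d_h @ x.T

    # "Batch Manhattan" update: keep only the sign of each batch gradient,
    # discarding its magnitude.
    W2 -= lr * np.sign(g_W2)
    W1 -= lr * np.sign(g_W1)
```

In this sketch the asymmetry affects only credit assignment to the hidden layer (through B2); the sign-only update then removes any dependence on gradient magnitudes, which is the regime the paper reports as matching or beating SGD.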


Results from the Paper


Ranked #1 on Handwritten Digit Recognition on MNIST (Percentage error metric)

| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Image Classification | CIFAR-10 | Sign-symmetry | Percentage correct | 80.98 | #213 |
| Image Classification | CIFAR-100 | Sign-symmetry | Percentage correct | 48.75 | #192 |
| Handwritten Digit Recognition | MNIST | Sign-symmetry | Percentage error | 0.91 | #1 |
| Image Classification | STL-10 | Sign-symmetry | Percentage correct | 57.32 | #116 |
| Image Classification | SVHN | Sign-symmetry | Percentage error | 10.16 | #46 |
