Noisy Concurrent Training for Efficient Learning under Label Noise

17 Sep 2020 · Fahad Sarfraz, Elahe Arani, Bahram Zonooz

Deep neural networks (DNNs) fail to learn effectively under label noise and have been shown to memorize random labels, which degrades their generalization performance. We identify learning in isolation, the use of one-hot encoded labels as the sole source of supervision, and the lack of regularization to discourage memorization as the major shortcomings of the standard training procedure. We therefore propose Noisy Concurrent Training (NCT), which leverages collaborative learning to use the consensus between two models as an additional source of supervision. Furthermore, inspired by trial-to-trial variability in the brain, we propose a counterintuitive regularization technique, target variability, which entails randomly changing the labels of a percentage of training samples in each batch as a deterrent to memorization and over-generalization in DNNs. Target variability is applied independently to each model to keep the two models diverged and to avoid confirmation bias. As DNNs tend to learn simple patterns first before memorizing noisy labels, we employ a dynamic learning scheme in which, as training progresses, the two models increasingly rely on their consensus. NCT also progressively increases the target variability to avoid memorization in the later stages of training. We demonstrate the effectiveness of our approach on both synthetic and real-world noisy benchmark datasets.
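
The sketch below (PyTorch assumed) illustrates the two ideas described in the abstract: target variability, which randomly relabels a fraction of each batch, and a consensus term whose weight grows as training progresses. The function names, the linear ramp schedules, and the hyperparameter values are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of target variability plus a dynamically weighted consensus loss.
# Schedules and constants below are assumptions for illustration only.
import torch
import torch.nn.functional as F


def apply_target_variability(labels, num_classes, gamma):
    """Randomly replace a fraction `gamma` of the labels in the batch."""
    labels = labels.clone()
    mask = torch.rand(labels.size(0), device=labels.device) < gamma
    random_labels = torch.randint(0, num_classes, labels.shape, device=labels.device)
    labels[mask] = random_labels[mask]
    return labels


def nct_loss(logits_a, logits_b, labels, num_classes, epoch, total_epochs,
             gamma_max=0.3, alpha_max=0.9):
    """Loss for model A, using model B's predictions as consensus targets.

    Both the target-variability rate (gamma) and the consensus weight (alpha)
    ramp up with the epoch, following the dynamic scheme described in the
    abstract (the exact schedules here are assumed, not taken from the paper).
    """
    progress = epoch / max(total_epochs - 1, 1)
    gamma = gamma_max * progress          # progressively more target variability
    alpha = alpha_max * progress          # progressively more reliance on consensus

    # Target variability is applied independently to each model;
    # here we perturb only the labels seen by model A.
    noisy_labels = apply_target_variability(labels, num_classes, gamma)

    ce = F.cross_entropy(logits_a, noisy_labels)
    # Consensus term: match model B's predictions (no gradient flows into B).
    consensus = F.kl_div(F.log_softmax(logits_a, dim=1),
                         F.softmax(logits_b.detach(), dim=1),
                         reduction="batchmean")
    return (1.0 - alpha) * ce + alpha * consensus
```

The symmetric loss for model B would swap the roles of the two sets of logits and draw its own, independently perturbed labels, which is what keeps the two models diverged.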

Results

Task: Image Classification
Dataset: mini WebVision 1.0
Model: NCT (Inception-ResNet-v2)

Metric Name                  Metric Value    Global Rank
Top-1 Accuracy               75.16           #34
Top-5 Accuracy               90.77           #23
ImageNet Top-1 Accuracy      71.73           #25
ImageNet Top-5 Accuracy      91.61           #20

Methods


No methods listed for this paper.