no code implementations • CVPR 2021 • Youngdong Kim, Juseung Yun, Hyounguk Shon, Junmo Kim
Because directly providing the label to the data (Positive Learning; PL) carries the risk of letting CNNs memorize contaminated labels when the data are noisy, the indirect learning approach that uses complementary labels (Negative Learning for Noisy Labels; NLNL) has proven highly effective in preventing overfitting to noisy data, as it reduces the risk of providing a faulty target.
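To make the complementary-label idea concrete, below is a minimal sketch of a Negative Learning loss in PyTorch. It is not the authors' implementation; the function name `negative_learning_loss` and the uniform sampling of one complementary label per sample are illustrative assumptions, chosen only to show how the loss pushes down the probability of a label the image is claimed *not* to belong to.

```python
import torch
import torch.nn.functional as F

def negative_learning_loss(logits, noisy_labels, num_classes):
    """Illustrative Negative Learning (NL) loss: rather than raising the
    probability of the given (possibly noisy) label, it lowers the
    probability of a randomly drawn complementary label
    ("this image is NOT class k"), i.e. minimizes -log(1 - p_k)."""
    probs = F.softmax(logits, dim=1)
    # Draw one complementary label per sample, uniformly from all classes
    # except the given (noisy) label.
    offsets = torch.randint(1, num_classes, noisy_labels.shape,
                            device=logits.device)
    comp_labels = (noisy_labels + offsets) % num_classes
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-7).mean()
```

Because the complementary label is almost always correct even when the given label is wrong, this indirect target is far less likely to be faulty than the direct one used in PL.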
1 code implementation • ICCV 2019 • Youngdong Kim, Junho Yim, Juseung Yun, Junmo Kim
The classical method of training CNNs is to label images in a supervised manner, as in "this input image belongs to this label" (Positive Learning; PL), which is fast and accurate provided the labels are assigned correctly to all images.
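For contrast with the NL sketch above, here is a minimal sketch of the PL setup the abstract refers to: an ordinary supervised cross-entropy step on the given label. The helper name `positive_learning_step` is illustrative, not from the paper.

```python
import torch
import torch.nn.functional as F

def positive_learning_step(model, optimizer, images, labels):
    """Illustrative Positive Learning (PL) step: standard cross-entropy on
    the given label ("this image belongs to this class"). If some labels
    are wrong, repeatedly minimizing this loss lets the network memorize
    the contaminated labels."""
    optimizer.zero_grad()
    loss = F.cross_entropy(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```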