1 code implementation • 1 Feb 2022 • Takashi Ishida, Ikko Yamane, Nontawat Charoenphakdee, Gang Niu, Masashi Sugiyama
In contrast to existing approaches, our method is model-free and even instance-free.
no code implementations • 1 Mar 2021 • Ziqing Lu, Chang Xu, Bo Du, Takashi Ishida, Lefei Zhang, Masashi Sugiyama
Developing regularization algorithms to mitigate overfitting in neural networks is a major area of study.
1 code implementation • ICML 2020 • Takashi Ishida, Ikko Yamane, Tomoya Sakai, Gang Niu, Masashi Sugiyama
We experimentally show that flooding improves performance and, as a byproduct, induces a double descent curve of the test loss.
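Flooding admits a one-line implementation: if J is the training loss and b the flood level, train on |J − b| + b instead. A minimal sketch, where the flood level b is an illustrative value rather than one taken from the paper:

```python
def flooded_loss(loss, b=0.25):
    """Flooding: instead of driving the training loss to zero, keep it
    hovering around the flood level b. Once the loss falls below b, the
    sign of the gradient flips, so the optimizer performs gradient
    ascent until the loss rises above b again."""
    return abs(loss - b) + b

# Above the flood level the loss is unchanged; below it, it is
# reflected upward (values chosen to be exact in binary floating point).
print(flooded_loss(0.5))    # -> 0.5
print(flooded_loss(0.125))  # -> 0.375
```

In a typical training loop this wraps the mini-batch loss just before backpropagation, leaving the model and optimizer untouched.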
1 code implementation • Proceedings of the 36th International Conference on Machine Learning, 2019 • Takashi Ishida, Gang Niu, Aditya Krishna Menon, Masashi Sugiyama
In contrast to the standard classification paradigm, where the true class is given for each training pattern, complementary-label learning uses training patterns each equipped only with a complementary label, which specifies one of the classes that the pattern does not belong to.
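To illustrate the setting, here is a minimal sketch of one simple surrogate loss for complementary labels: push down the predicted probability of the class the example is known not to belong to. This surrogate is an assumption for illustration; the paper itself derives unbiased risk estimators rather than using this particular loss.

```python
import numpy as np

def complementary_loss(logits, comp_label):
    """One simple surrogate for complementary-label learning:
    minimize -log(1 - p[comp_label]), i.e. reduce the predicted
    probability of the class the pattern is known NOT to belong to.
    (Illustrative only; not the paper's unbiased risk estimator.)"""
    z = logits - logits.max()            # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()      # softmax probabilities
    return -np.log(1.0 - p[comp_label])

logits = np.array([2.0, 0.5, -1.0])
# "Not class 2": that class already has low probability, so the loss is
# small; "not class 0" contradicts the confident prediction, so it is large.
print(complementary_loss(logits, 2))
print(complementary_loss(logits, 0))
```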
Ranked #21 on Image Classification on Kuzushiji-MNIST
1 code implementation • NeurIPS 2018 • Takashi Ishida, Gang Niu, Masashi Sugiyama
Can we learn a binary classifier from only positive data, without any negative data or unlabeled data?
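The paper's answer is yes, provided each positive example comes with its confidence r(x) = p(y = +1 | x): the classification risk can then be rewritten as an expectation over positive data only. A minimal sketch of that empirical objective, assuming a logistic loss and with margins and confidences as hypothetical inputs:

```python
import numpy as np

def pconf_risk(margins, conf):
    """Empirical positive-confidence (Pconf) risk, up to the class prior:
        mean over positives of  loss(g(x)) + (1/r(x) - 1) * loss(-g(x)),
    where r(x) = p(y=+1 | x) is the confidence attached to each positive
    example and g(x) the classifier margin. Logistic loss is assumed.
    (A sketch of the rewritten risk, not a full training pipeline.)"""
    loss = lambda z: np.log1p(np.exp(-z))  # logistic loss
    return np.mean(loss(margins) + (1.0 / conf - 1.0) * loss(-margins))

# Hypothetical margins g(x_i) and confidences r_i for three positives.
margins = np.array([1.0, -0.5, 2.0])
conf = np.array([0.9, 0.6, 0.8])
print(pconf_risk(margins, conf))
```

Note that when every confidence equals 1 the correction term vanishes and the objective reduces to the ordinary logistic loss on the positive data.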
1 code implementation • NeurIPS 2017 • Takashi Ishida, Gang Niu, Weihua Hu, Masashi Sugiyama
Collecting complementary labels would be less laborious than collecting ordinary labels, since users do not have to carefully choose the correct class from a long list of candidate classes.