Search Results for author: Takashi Ishida

Found 8 papers, 5 papers with code

Do We Need Zero Training Loss After Achieving Zero Training Error?

1 code implementation ICML 2020 Takashi Ishida, Ikko Yamane, Tomoya Sakai, Gang Niu, Masashi Sugiyama

We experimentally show that flooding improves performance and, as a byproduct, induces a double descent curve of the test loss.

Tasks: Memorization
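The flooding objective from this paper is simple enough to sketch: for a flood level b, the training loss J is replaced by |J - b| + b, so the optimizer descends as usual while the loss is above b and ascends back toward b once it drops below. A minimal PyTorch sketch of the operator (the flood-level value in the usage comment is an illustrative placeholder, not a value from the paper):

```python
import torch

def flooded_loss(loss: torch.Tensor, flood_level: float) -> torch.Tensor:
    """Flooding operator |J - b| + b from Ishida et al. (ICML 2020)."""
    # Above the flood level the gradient is unchanged; below it, the sign
    # flips and the optimizer performs gradient ascent back toward b.
    return (loss - flood_level).abs() + flood_level

# Usage in a training loop (0.02 is an illustrative hyperparameter to tune):
# loss = flooded_loss(F.cross_entropy(logits, targets), flood_level=0.02)
```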

Learning from Complementary Labels

1 code implementation NeurIPS 2017 Takashi Ishida, Gang Niu, Weihua Hu, Masashi Sugiyama

Collecting complementary labels would be less laborious than collecting ordinary labels, since users do not have to carefully choose the correct class from a long list of candidate classes.

Tasks: Classification, General Classification, +1
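To make the supervision signal concrete: each training example carries a class it does not belong to, so a learner can at least push probability mass away from that class. The sketch below is a naive illustration of that signal, not the unbiased risk estimator the paper derives; the function name and loss form are my own.

```python
import torch
import torch.nn.functional as F

def naive_complementary_loss(logits: torch.Tensor,
                             comp_labels: torch.Tensor) -> torch.Tensor:
    # comp_labels[i] is a class that example i is known NOT to belong to.
    probs = F.softmax(logits, dim=1)
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    # Penalize probability mass on the ruled-out class: -log(1 - p).
    return -torch.log(torch.clamp(1.0 - p_comp, min=1e-12)).mean()
```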

Complementary-Label Learning for Arbitrary Losses and Models

1 code implementation ICML 2019 Takashi Ishida, Gang Niu, Aditya Krishna Menon, Masashi Sugiyama

In contrast to the standard classification paradigm where the true class is given to each training pattern, complementary-label learning only uses training patterns each equipped with a complementary label, which only specifies one of the classes that the pattern does not belong to.

Tasks: General Classification, Image Classification
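Under the common assumption that the complementary label is drawn uniformly from the K - 1 wrong classes, the classification risk can be rewritten using only complementarily labeled data as R(f) = E_x[ sum_k loss(f(x), k) ] - (K - 1) * E[ loss(f(x), y_bar) ]. The sketch below implements that rewrite with the log-loss; treat the exact estimator form as my reading of this line of work under the uniform assumption, not a verbatim transcription of the paper.

```python
import torch
import torch.nn.functional as F

def unbiased_complementary_risk(logits: torch.Tensor,
                                comp_labels: torch.Tensor) -> torch.Tensor:
    num_classes = logits.size(1)
    per_class_loss = -F.log_softmax(logits, dim=1)   # loss(f(x), k) for all k
    sum_over_classes = per_class_loss.sum(dim=1)     # sum_k loss(f(x), k)
    comp_loss = per_class_loss.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    # R(f) = E[ sum over classes ] - (K - 1) * E[ loss at complementary label ]
    return (sum_over_classes - (num_classes - 1) * comp_loss).mean()
```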

LocalDrop: A Hybrid Regularization for Deep Neural Networks

no code implementations 1 Mar 2021 Ziqing Lu, Chang Xu, Bo Du, Takashi Ishida, Lefei Zhang, Masashi Sugiyama

In neural network research, developing regularization algorithms to combat overfitting is a major area of study.

Flooding Regularization for Stable Training of Generative Adversarial Networks

no code implementations 1 Nov 2023 Iu Yahiro, Takashi Ishida, Naoto Yokoya

One of the main approaches to stabilizing GAN training is to modify the loss function, often by adding regularization terms in addition to changing the type of adversarial loss.

Tasks: Image Generation
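Given the title, one plausible wiring (an assumption on my part, not necessarily the paper's exact recipe) is to apply the flooding operator sketched earlier to the adversarial objective, so the discriminator's loss cannot collapse toward zero:

```python
import torch

def flooded(loss: torch.Tensor, b: float) -> torch.Tensor:
    # Same flooding operator as above: |J - b| + b.
    return (loss - b).abs() + b

# Hypothetical discriminator step, where d_loss is the usual adversarial loss:
# flooded(d_loss, b=0.1).backward()   # b = 0.1 is an illustrative value
```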

The Selected-completely-at-random Complementary Label is a Practical Weak Supervision for Multi-class Classification

no code implementations 27 Nov 2023 Wei Wang, Takashi Ishida, Yu-Jie Zhang, Gang Niu, Masashi Sugiyama

Complementary-label learning is a weakly supervised learning problem in which each training example is associated with one or multiple complementary labels indicating the classes to which it does not belong.

Tasks: Binary Classification, Multi-class Classification, +1
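When an example can carry several complementary labels, a natural representation is a multi-hot mask of ruled-out classes. A naive illustrative loss under that representation (again not the paper's estimator; the mask convention is my own):

```python
import torch
import torch.nn.functional as F

def multi_complementary_loss(logits: torch.Tensor,
                             comp_mask: torch.Tensor) -> torch.Tensor:
    # comp_mask is (N, K) with 1 marking each class the example is
    # known NOT to belong to (possibly several per example).
    probs = F.softmax(logits, dim=1)
    p_ruled_out = (probs * comp_mask).sum(dim=1)
    return -torch.log(torch.clamp(1.0 - p_ruled_out, min=1e-12)).mean()
```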
