no code implementations • 1 Jan 2021 • Xingrui Yu, Yueming Lyu, Ivor Tsang
Our method learns useful planning computations with a meaningful reward function that focuses on the region an agent reaches after executing an action.
no code implementations • 1 Jan 2021 • Yueming Lyu, Xingrui Yu, Ivor Tsang
In this work, we take an initial step toward designing a simple robust layer as a lightweight plug-in for vanilla deep models.
1 code implementation • ICML 2020 • Xingrui Yu, Yueming Lyu, Ivor W. Tsang
Thus, our module provides the imitation agent with both the intrinsic intention of the demonstrator and a better exploration ability, which is critical for the agent to outperform the demonstrator.
no code implementations • 23 Jan 2019 • He Zhang, Xingrui Yu, Peng Ren, Chunbo Luo, Geyong Min
The novelty of the proposed framework lies in incorporating deep adversarial learning with statistical learning and exploiting learning-based data augmentation.
3 code implementations • 14 Jan 2019 • Xingrui Yu, Bo Han, Jiangchao Yao, Gang Niu, Ivor W. Tsang, Masashi Sugiyama
Learning with noisy labels is one of the most actively studied problems in weakly-supervised learning.
Ranked #13 on Learning with noisy labels on CIFAR-100N
1 code implementation • ICML 2020 • Bo Han, Gang Niu, Xingrui Yu, Quanming Yao, Miao Xu, Ivor Tsang, Masashi Sugiyama
Given data with noisy labels, over-parameterized deep networks can gradually memorize the data, and fit everything in the end.
no code implementations • 27 Sep 2018 • Bo Han, Gang Niu, Jiangchao Yao, Xingrui Yu, Miao Xu, Ivor Tsang, Masashi Sugiyama
To handle these issues, we exploit the memorization effects of deep neural networks: we train the networks on the whole dataset for only the first few iterations.
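The "whole dataset only for the first few iterations" idea is usually realized as a keep-rate schedule: early in training (while the network still fits clean data before memorizing noise) nearly all samples are used, and the fraction kept is then gradually reduced. A minimal sketch of such a schedule (the function name, parameter names, and the linear ramp are illustrative choices, not taken from the paper):

```python
def keep_rate(epoch, tau=0.5, t_k=10):
    """Fraction of (small-loss) samples kept at a given epoch.

    For the first t_k epochs the rate ramps linearly from 1.0
    (train on everything) down to 1 - tau, then stays there,
    so the network sees the whole dataset only early on,
    before it starts memorizing noisy labels.
    """
    return 1.0 - tau * min(epoch / t_k, 1.0)

# Illustrative values: full data at epoch 0, half kept from epoch 10 on.
rates = [keep_rate(e) for e in (0, 5, 10, 50)]
```

Here `tau` plays the role of an estimated noise rate: asymptotically, roughly that fraction of samples is dropped each epoch.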
5 code implementations • NeurIPS 2018 • Bo Han, Quanming Yao, Xingrui Yu, Gang Niu, Miao Xu, Weihua Hu, Ivor Tsang, Masashi Sugiyama
Deep learning with noisy labels is practically challenging, as the capacity of deep models is so high that they can eventually memorize the noisy labels during training.
Ranked #8 on Learning with noisy labels on CIFAR-10N-Random3
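The Co-teaching idea behind this entry can be sketched in a few lines: two networks each rank a mini-batch by per-sample loss, and each feeds its small-loss (probably clean) samples to its peer for the update. A minimal NumPy illustration (function names and the toy loss values are ours, not from the paper; in practice the losses would come from two networks' forward passes):

```python
import numpy as np

def small_loss_select(losses, forget_rate):
    """Indices of the (1 - forget_rate) fraction of samples with the
    smallest loss -- treated as 'probably clean' under label noise."""
    n_keep = int((1.0 - forget_rate) * len(losses))
    return np.argsort(losses)[:n_keep]

def coteaching_step(losses_a, losses_b, forget_rate):
    """One Co-teaching exchange: each network selects its small-loss
    samples, and its PEER is then trained on that selection."""
    idx_for_b = small_loss_select(losses_a, forget_rate)  # A teaches B
    idx_for_a = small_loss_select(losses_b, forget_rate)  # B teaches A
    return idx_for_a, idx_for_b

# Toy batch of 8 samples: large losses mark likely-noisy labels.
losses_a = np.array([0.10, 2.5, 0.20, 3.0, 0.15, 0.30, 2.8, 0.25])
losses_b = np.array([0.12, 2.4, 0.18, 2.9, 0.20, 0.35, 3.1, 0.22])
idx_for_a, idx_for_b = coteaching_step(losses_a, losses_b, forget_rate=0.5)
```

Using two networks rather than one (self-paced selection) matters because the peers make different errors, so mistakes from one network's selection are not directly fed back into itself.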