no code implementations • 25 Feb 2023 • Yi Gao, Miao Xu, Min-Ling Zhang
Moreover, theoretical findings reveal that estimating a transition matrix from label correlations in \textit{multi-labeled CLL} (ML-CLL) requires multi-labeled data, which is unavailable in the ML-CLL setting.
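For context on what a CLL transition matrix is: under the common uniform assumption, each complementary (non-true) label is equally likely, giving a closed-form matrix that can correct an ordinary loss. This is a minimal sketch of that standard single-label CLL construction, not the paper's ML-CLL method; all names here are illustrative.

```python
import numpy as np

def uniform_transition_matrix(num_classes: int) -> np.ndarray:
    """Uniform-assumption CLL transition matrix:
    T[i, j] = P(complementary label = j | true label = i)."""
    K = num_classes
    T = np.full((K, K), 1.0 / (K - 1))
    np.fill_diagonal(T, 0.0)  # the true label is never given as complementary
    return T

def complementary_probs(probs: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Map predicted true-label probabilities to complementary-label
    probabilities; cross-entropy on complementary labels then becomes
    a consistent (forward-corrected) loss."""
    return probs @ T

T = uniform_transition_matrix(4)
p = np.array([0.7, 0.1, 0.1, 0.1])   # model's true-label prediction
q = complementary_probs(p, T)        # implied complementary-label distribution
```

Each entry of `q` is simply `(1 - p_j) / (K - 1)`, which is why the uniform assumption needs no extra data to estimate, while label-correlated (non-uniform) matrices, as the paper notes, would require multi-labeled supervision.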
1 code implementation • 16 Oct 2022 • Jonathan Wilton, Abigail M. Y. Koay, Ryan K. L. Ko, Miao Xu, Nan Ye
Key to our approach is a new interpretation of decision tree algorithms for positive and negative data as \emph{recursive greedy risk minimization algorithms}.
no code implementations • 13 Aug 2022 • Tong Wang, Yuan Yao, Feng Xu, Miao Xu, Shengwei An, Ting Wang
Existing defenses are mainly built upon the observation that the backdoor trigger is usually of small size or affects the activation of only a few neurons.
no code implementations • 19 May 2022 • Yawen Zhao, Mingzhe Zhang, Chenhao Zhang, Weitong Chen, Nan Ye, Miao Xu
This is because AdaPU learns a weak classifier and its weight from a weighted positive-negative (PN) dataset in which some data weights are negative: the dataset is derived from the original PU data, and the weights are determined by the current weighted classifier combination.
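To see where negative data weights arise in PU learning: the standard unbiased PU estimate of the PN risk (in the style of du Plessis et al.) rewrites the negative-class risk using unlabeled data minus a prior-weighted positive term, so positive examples carry a negative weight on the negative-class loss. A minimal sketch of that estimator, illustrative of the mechanism rather than AdaPU itself:

```python
import numpy as np

def logistic_loss(scores, y):
    """Logistic surrogate for the 0-1 loss."""
    return np.log1p(np.exp(-y * scores))

def pu_unbiased_risk(scores_p, scores_u, prior, loss=logistic_loss):
    """Unbiased PU estimate of the PN classification risk.
    The negative-class risk is estimated from unlabeled data minus a
    prior-weighted positive term -- the source of negative weights
    on the positive sample."""
    pos = prior * np.mean(loss(scores_p, +1))
    neg = np.mean(loss(scores_u, -1)) - prior * np.mean(loss(scores_p, -1))
    return pos + neg

# toy usage: all-zero scores give loss log(2) everywhere
risk = pu_unbiased_risk(np.zeros(3), np.zeros(5), prior=0.5)
```

With all-zero scores every term equals log 2, and the estimator collapses to log 2 regardless of the prior, which makes it a convenient sanity check.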
1 code implementation • 9 May 2022 • Yueying Kao, Bowen Pan, Miao Xu, Jiangjing Lyu, Xiangyu Zhu, Yuanzhang Chang, Xiaobo Li, Zhen Lei
In 3D face reconstruction, orthogonal projection has been widely employed to substitute perspective projection to simplify the fitting process.
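The trade-off between the two projection models can be seen in a few lines: orthographic projection discards depth entirely, which is only a good approximation when depth variation is small relative to camera distance (the usual assumption in face fitting). A minimal sketch with illustrative parameter names:

```python
import numpy as np

def perspective_project(points, focal):
    """Pinhole projection: (x, y, z) -> (f*x/z, f*y/z)."""
    return focal * points[:, :2] / points[:, 2:3]

def orthographic_project(points, scale=1.0):
    """Scaled orthographic projection: depth z is dropped,
    so points differing only in z project identically."""
    return scale * points[:, :2]

# two points with identical (x, y) but different depth
pts = np.array([[0.1, 0.2, 10.0],
                [0.1, 0.2, 10.5]])
persp = perspective_project(pts, focal=1.0)
orth = orthographic_project(pts)
```

Under perspective projection the two points land at different image coordinates; under orthographic projection they coincide, which is exactly the simplification (and the source of error) the paper addresses.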
no code implementations • 17 Dec 2021 • Guanhua Ye, Hongzhi Yin, Tong Chen, Miao Xu, Quoc Viet Hung Nguyen, Jiangning Song
Driven by the growing attention to personal healthcare and the pandemic, E-health has been rapidly gaining popularity.
no code implementations • 29 Sep 2021 • Cheng-Yu Hsieh, Wei-I Lin, Miao Xu, Gang Niu, Hsuan-Tien Lin, Masashi Sugiyama
The goal of multi-label learning (MLL) is to associate a given instance with its relevant labels from a set of concepts.
no code implementations • 11 Jun 2021 • Jiaqi Lv, Biao Liu, Lei Feng, Ning Xu, Miao Xu, Bo An, Gang Niu, Xin Geng, Masashi Sugiyama
Partial-label learning (PLL) utilizes instances with partial labels (PLs), where a PL includes several candidate labels but only one is the true label (TL).
1 code implementation • 11 Jun 2021 • Chao Wen, Miao Xu, Zhilin Zhang, Zhenzhe Zheng, Yuhui Wang, Xiangyu Liu, Yu Rong, Dong Xie, Xiaoyang Tan, Chuan Yu, Jian Xu, Fan Wu, Guihai Chen, Xiaoqiang Zhu, Bo Zheng
Third, to deploy MAAB in the large-scale advertising system with millions of advertisers, we propose a mean-field approach.
no code implementations • 10 Mar 2021 • Changwei Zou, Zhenqi Hao, Xiangyu Luo, Shusen Ye, Qiang Gao, Xintong Li, Miao Xu, Peng Cai, Chengtian Lin, Xingjiang Zhou, Dung-Hai Lee, Yayu Wang
To elucidate the superconductor-to-metal transition at the end of the superconducting dome, the overdoped regime has recently moved to the center stage of cuprate research.
no code implementations • 5 Dec 2020 • Zhilin Zhang, Xiangyu Liu, Zhenzhe Zheng, Chenrui Zhang, Miao Xu, Junwei Pan, Chuan Yu, Fan Wu, Jian Xu, Kun Gai
In e-commerce advertising, the ad platform usually relies on auction mechanisms to optimize different performance metrics, such as user experience, advertiser utility, and platform revenue.
1 code implementation • NeurIPS 2020 • Long Chen, Yuan Yao, Feng Xu, Miao Xu, Hanghang Tong
Collaborative filtering has been widely used in recommender systems.
no code implementations • 5 Oct 2020 • Lei Feng, Senlin Shu, Nan Lu, Bo Han, Miao Xu, Gang Niu, Bo An, Masashi Sugiyama
To alleviate the data requirement for training effective binary classifiers, many weakly supervised learning settings have been proposed.
no code implementations • 3 Sep 2020 • Zhaoqing Peng, Junqi Jin, Lan Luo, Yaodong Yang, Rui Luo, Jun Wang, Wei-Nan Zhang, Haiyang Xu, Miao Xu, Chuan Yu, Tiejian Luo, Han Li, Jian Xu, Kun Gai
To drive purchases in online advertising, it is in the advertiser's great interest to optimize the sequential advertising strategy, whose performance and interpretability are both important.
no code implementations • NeurIPS 2020 • Lei Feng, Jiaqi Lv, Bo Han, Miao Xu, Gang Niu, Xin Geng, Bo An, Masashi Sugiyama
Partial-label learning (PLL) is a multi-class classification problem, where each training example is associated with a set of candidate labels.
no code implementations • ICML 2020 • Xiaotian Hao, Zhaoqing Peng, Yi Ma, Guan Wang, Junqi Jin, Jianye Hao, Shan Chen, Rongquan Bai, Mingzhou Xie, Miao Xu, Zhenzhe Zheng, Chuan Yu, Han Li, Jian Xu, Kun Gai
In E-commerce, advertising is essential for merchants to reach their target users.
1 code implementation • ICML 2020 • Jiaqi Lv, Miao Xu, Lei Feng, Gang Niu, Xin Geng, Masashi Sugiyama
Partial-label learning (PLL) is a typical weakly supervised learning problem, where each training instance is equipped with a set of candidate labels among which only one is the true label.
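A simple way to make the PLL setting concrete is the candidate-set likelihood loss: the model is only required to place probability mass somewhere inside the candidate set. This is a common classifier-consistent baseline, not the identification-based method of the paper (which progressively reweights candidates by the model's own predictions); names are illustrative.

```python
import numpy as np

def candidate_set_loss(probs, candidate_mask):
    """Negative log-likelihood of the candidate set.
    probs: (n, K) predicted class probabilities;
    candidate_mask: (n, K) binary, 1 marks a candidate label."""
    cand_prob = (probs * candidate_mask).sum(axis=1)  # mass on candidates
    return -np.mean(np.log(cand_prob + 1e-12))        # small eps for stability

probs = np.full((2, 4), 0.25)                   # uniform predictions, K = 4
mask = np.array([[1, 1, 0, 0],                  # 2 candidate labels
                 [1, 0, 0, 0]])                 # fully labeled instance
loss_val = candidate_set_loss(probs, mask)
```

With uniform predictions, the first instance contributes −log 0.5 and the second −log 0.25, showing how larger candidate sets provide weaker supervision.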
no code implementations • ICLR Workshop LLD 2019 • Cheng-Yu Hsieh, Miao Xu, Gang Niu, Hsuan-Tien Lin, Masashi Sugiyama
To address this need, we propose a special weakly supervised MLL problem that not only focuses on the situation of limited fine-grained supervision but also leverages the hierarchical relationship between the coarse concepts and the fine-grained ones.
no code implementations • 29 Jan 2019 • Miao Xu, Bingcong Li, Gang Niu, Bo Han, Masashi Sugiyama
Could there be a new sample selection method that outperforms the latest importance-reweighting method in the deep learning age?
1 code implementation • ICML 2020 • Bo Han, Gang Niu, Xingrui Yu, Quanming Yao, Miao Xu, Ivor Tsang, Masashi Sugiyama
Given data with noisy labels, over-parameterized deep networks can gradually memorize the data, and fit everything in the end.
no code implementations • 27 Sep 2018 • Bo Han, Gang Niu, Jiangchao Yao, Xingrui Yu, Miao Xu, Ivor Tsang, Masashi Sugiyama
To handle these issues, by exploiting the memorization effect of deep neural networks, we may train deep neural networks on the whole dataset only in the first few iterations.
no code implementations • 13 Sep 2018 • Takeshi Teshima, Miao Xu, Issei Sato, Masashi Sugiyama
On the other hand, matrix completion (MC) methods can recover a low-rank matrix under various forms of missing information by exploiting its low-rank structure.
no code implementations • 23 May 2018 • Miao Xu, Gang Niu, Bo Han, Ivor W. Tsang, Zhi-Hua Zhou, Masashi Sugiyama
We consider a challenging multi-label classification problem where both the feature matrix $\mathbf{X}$ and the label matrix $\mathbf{Y}$ have missing entries.
5 code implementations • NeurIPS 2018 • Bo Han, Quanming Yao, Xingrui Yu, Gang Niu, Miao Xu, Weihua Hu, Ivor Tsang, Masashi Sugiyama
Deep learning with noisy labels is practically challenging, as the capacity of deep models is so high that they will sooner or later completely memorize the noisy labels during training.
Ranked #7 on Learning with noisy labels on CIFAR-10N-Random3
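The core mechanism in this line of work is small-loss sample selection: because networks fit clean patterns before memorizing noise, low-loss examples are likely clean, and two peer networks can select such examples for each other so errors are not self-reinforced. A minimal sketch of that selection step (illustrative, not the full training loop):

```python
import numpy as np

def small_loss_select(losses, forget_rate):
    """Indices of the (1 - forget_rate) fraction with smallest loss,
    treated as probably-clean samples."""
    n_keep = int(len(losses) * (1.0 - forget_rate))
    return np.argsort(losses)[:n_keep]

# per-sample losses from two peer networks on the same mini-batch
loss_a = np.array([0.1, 2.0, 0.3, 1.5])
loss_b = np.array([0.2, 0.1, 1.8, 0.4])

# cross-update: each network trains on samples its peer selected
idx_for_b = small_loss_select(loss_a, forget_rate=0.5)  # A picks for B
idx_for_a = small_loss_select(loss_b, forget_rate=0.5)  # B picks for A
```

The cross-update is the design choice that distinguishes this from self-selection: a network's own selection bias is filtered through its peer rather than fed back into itself.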
no code implementations • 15 Feb 2018 • Sheng-Jun Huang, Miao Xu, Ming-Kun Xie, Masashi Sugiyama, Gang Niu, Songcan Chen
Missing features are a serious problem in many applications, which may lead to low-quality training data and further significantly degrade the learning performance.
no code implementations • 4 Nov 2014 • Miao Xu, Rong Jin, Zhi-Hua Zhou
In particular, the proposed algorithm computes the low rank approximation of the target matrix based on (i) the randomly sampled rows and columns, and (ii) a subset of observed entries that are randomly sampled from the matrix.
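The row/column-sampling ingredient can be illustrated with a CUR-style approximation: sampled columns C, sampled rows R, and the pseudoinverse of their intersection reconstruct a rank-r matrix exactly once the intersection has full rank. This sketch shows only that ingredient, under assumed Gaussian factors, not the paper's full algorithm (which additionally uses randomly observed entries):

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 200, 5
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r target

# randomly sample columns and rows
idx_c = rng.choice(n, size=4 * r, replace=False)
idx_r = rng.choice(n, size=4 * r, replace=False)
C = M[:, idx_c]                               # sampled columns
R = M[idx_r, :]                               # sampled rows
U = np.linalg.pinv(M[np.ix_(idx_r, idx_c)])   # pseudoinverse of intersection

M_hat = C @ U @ R                             # CUR-style reconstruction
```

When rank(intersection) = rank(M), the identity M = C W⁺ R holds exactly, which is why only O(r) sampled rows and columns suffice in the noiseless low-rank case.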
no code implementations • NeurIPS 2013 • Miao Xu, Rong Jin, Zhi-Hua Zhou
In standard matrix completion theory, it is required to have at least $O(n\ln^2 n)$ observed entries to perfectly recover a low-rank matrix $M$ of size $n\times n$, leading to a large number of observations when $n$ is large.
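A quick calculation makes the $O(n\ln^2 n)$ rate tangible: for a large matrix it is a vanishing fraction of the $n^2$ total entries, yet "a large number of observations" in absolute terms, which motivates seeking even lower sample complexity.

```python
import math

n = 10**6
full_entries = n * n                    # observing the entire n x n matrix
mc_entries = n * math.log(n) ** 2       # standard MC rate (up to constants)
ratio = mc_entries / full_entries       # fraction of entries needed
# ratio is on the order of 1e-4, i.e. far below observing everything,
# yet mc_entries is still in the hundreds of millions for n = 10^6
```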