no code implementations • ICCV 2023 • Xinheng Wu, Jie Lu, Zhen Fang, Guangquan Zhang
To address CAOOD, we develop meta OOD learning (MOL), which designs a learning-to-adapt paradigm such that a well-initialized OOD detection model is learned during the training process.
1 code implementation • 16 Oct 2022 • Tianyu Liu, Jie Lu, Zheng Yan, Guangquan Zhang
The framework offers both a guarantee of generalization performance and good accuracy.
1 code implementation • 9 Jun 2022 • Guangzhi Ma, Jie Lu, Feng Liu, Zhen Fang, Guangquan Zhang
Hence, in this paper, we propose a novel framework to address a new realistic problem called multi-class classification with imprecise observations (MCIMO), where we need to train a classifier with fuzzy-feature observations.
no code implementations • 27 Sep 2021 • Junyu Xuan, Jie Lu, Guangquan Zhang
Transfer learning, which extracts transferable knowledge from the source domain(s) and reuses this knowledge in the target domain, has become a research area of great interest in the field of artificial intelligence.
no code implementations • 20 Sep 2021 • Adi Lin, Jie Lu, Junyu Xuan, Fujin Zhu, Guangquan Zhang
Causal effect estimation for dynamic treatment regimes (DTRs) contributes to sequential decision making.
1 code implementation • 30 Jun 2021 • Zhen Fang, Jie Lu, Anjin Liu, Feng Liu, Guangquan Zhang
In this paper, we target a more challenging and realistic setting: open-set learning (OSL), where there exist test samples from the classes that are unseen during training.
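The paper's OSL method is not reproduced here, but as a minimal point of reference, a standard open-set baseline rejects a test sample when its maximum softmax probability falls below a threshold:

```python
import numpy as np

def msp_open_set_predict(logits, threshold=0.5, unknown_label=-1):
    """Maximum-softmax-probability baseline for open-set recognition.

    Samples whose top softmax score falls below `threshold` are rejected
    as belonging to an unseen class. Illustrative baseline only, not the
    method proposed in the paper.
    """
    logits = np.asarray(logits, dtype=float)
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    confidence = probs.max(axis=1)
    preds = probs.argmax(axis=1)
    preds[confidence < threshold] = unknown_label
    return preds

# Example: the second sample is low-confidence and rejected as "unknown".
print(msp_open_set_predict([[4.0, 0.1, 0.2], [1.0, 1.1, 0.9]]))
```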
no code implementations • 4 May 2021 • Hang Yu, Tianyu Liu, Jie Lu, Guangquan Zhang
Many methods have been proposed to detect concept drift, i.e., a change in the distribution of streaming data, because concept drift causes a decrease in the prediction accuracy of algorithms.
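As a minimal illustration of the underlying idea (a generic sketch, not any specific detector from this line of work), one can compare a reference window of the stream against the most recent window with a two-sample test and flag drift when the p-value is small:

```python
import numpy as np
from scipy.stats import ks_2samp

def detect_drift(reference, recent, alpha=0.01):
    """Flag concept drift on a univariate stream by comparing a reference
    window with the most recent window via a two-sample Kolmogorov-Smirnov
    test. Illustrative sketch only."""
    stat, p_value = ks_2samp(reference, recent)
    return p_value < alpha, p_value

rng = np.random.default_rng(0)
before = rng.normal(0.0, 1.0, 500)   # stable distribution
after = rng.normal(1.5, 1.0, 500)    # distribution has shifted
print(detect_drift(before, after))   # (True, tiny p-value)
```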
1 code implementation • 7 Feb 2021 • Tianyu Liu, Jie Lu, Zheng Yan, Guangquan Zhang
By leveraging experience from previous tasks, meta-learning algorithms can adapt quickly and effectively when encountering new tasks.
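For readers unfamiliar with fast adaptation, the sketch below shows a bare-bones MAML-style inner/outer loop on toy regression tasks; it is a generic illustration of the idea, not this paper's algorithm:

```python
import torch

# Generic MAML-style sketch: adapt a linear model to a task with one
# gradient step, then update the initialization from the post-adaptation
# loss. Illustrates fast adaptation in general, not this paper's method.
w = torch.zeros(1, requires_grad=True)   # meta-initialization
b = torch.zeros(1, requires_grad=True)
meta_opt = torch.optim.SGD([w, b], lr=1e-2)

def task_loss(w, b, x, y):
    return ((w * x + b - y) ** 2).mean()

for step in range(100):
    # Sample a toy regression task: y = a*x + c with random a, c.
    a, c = torch.randn(1), torch.randn(1)
    x_s, x_q = torch.randn(10), torch.randn(10)   # support / query sets
    y_s, y_q = a * x_s + c, a * x_q + c

    # Inner loop: one gradient step on the support set (create_graph=True
    # keeps the adaptation differentiable for the outer update).
    g_w, g_b = torch.autograd.grad(task_loss(w, b, x_s, y_s), [w, b],
                                   create_graph=True)
    w_fast, b_fast = w - 0.1 * g_w, b - 0.1 * g_b

    # Outer loop: evaluate the adapted parameters on the query set.
    meta_opt.zero_grad()
    task_loss(w_fast, b_fast, x_q, y_q).backward()
    meta_opt.step()
```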
no code implementations • 30 Dec 2020 • Li Zhong, Zhen Fang, Feng Liu, Jie Lu, Bo Yuan, Guangquan Zhang
Experiments show that the proxy can effectively curb the increase of the combined risk when minimizing the source risk and distribution discrepancy.
1 code implementation • 9 Aug 2020 • Anjin Liu, Jie Lu, Guangquan Zhang
Our solution comprises a novel masked distance learning (MDL) algorithm to reduce the cumulative errors caused by iteratively estimating each missing value in an observation, and a fuzzy-weighted frequency (FWF) method for identifying discrepancies in the data distribution.
1 code implementation • 4 Aug 2020 • Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu
We consider two cases of this setting, one is that the source domain only contains complementary-label data (completely complementary unsupervised domain adaptation, CC-UDA), and the other is that the source domain has plenty of complementary-label data and a small amount of true-label data (partly complementary unsupervised domain adaptation, PC-UDA).
1 code implementation • 29 Jul 2020 • Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu
To mitigate this problem, we consider a novel problem setting in which the classifier for the target domain has to be trained with complementary-label data from the source domain and unlabeled data from the target domain; we name this setting budget-friendly UDA (BFUDA).
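For context on what learning from complementary labels looks like (relevant to both BFUDA here and the CC-UDA/PC-UDA settings above), one simple surrogate pushes probability mass away from the class a sample is known not to belong to; this is an illustrative loss, not the risk estimators used in these papers:

```python
import torch
import torch.nn.functional as F

def complementary_loss(logits, comp_labels):
    """A simple surrogate loss for complementary-label data: each label
    says which class the sample does NOT belong to, so we push probability
    mass away from that class by minimizing -log(1 - p_comp).
    Illustrative only."""
    probs = F.softmax(logits, dim=1)
    p_comp = probs.gather(1, comp_labels.view(-1, 1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-12).mean()

logits = torch.randn(4, 5, requires_grad=True)   # 4 samples, 5 classes
comp = torch.tensor([2, 0, 4, 1])                # classes ruled out
complementary_loss(logits, comp).backward()
```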
no code implementations • 23 Jun 2020 • Li Zhong, Zhen Fang, Feng Liu, Bo Yuan, Guangquan Zhang, Jie Lu
To achieve this aim, a previous study proved an upper bound on the target-domain risk; the open set difference, an important term in this bound, is used to measure the risk on unknown target data.
no code implementations • 13 Apr 2020 • Anjin Liu, Jie Lu, Guangquan Zhang
Concept drift refers to changes in the distribution of underlying data and is an inherent property of evolving data streams.
no code implementations • 13 Apr 2020 • Jie Lu, Anjin Liu, Fan Dong, Feng Gu, Joao Gama, Guangquan Zhang
To help researchers identify which research topics are significant and how to apply related techniques in data analysis tasks, a high-quality, instructive review of current research developments and trends in the concept drift field is necessary.
1 code implementation • ICML 2020 • Feng Liu, Wenkai Xu, Jie Lu, Guangquan Zhang, Arthur Gretton, Danica J. Sutherland
We propose a class of kernel-based two-sample tests, which aim to determine whether two sets of samples are drawn from the same distribution.
Ranked #1 on Two-sample testing on HIGGS Data Set
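As a concrete reference point, the simplest member of this family of tests is an MMD test with a fixed Gaussian kernel and a permutation null; the sketch below is illustrative only and omits the more powerful kernels studied in the paper:

```python
import numpy as np

def mmd2(X, Y, bandwidth=1.0):
    """Biased squared-MMD estimate with a Gaussian RBF kernel."""
    Z = np.vstack([X, Y])
    sq = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * bandwidth ** 2))
    n = len(X)
    return K[:n, :n].mean() + K[n:, n:].mean() - 2 * K[:n, n:].mean()

def mmd_permutation_test(X, Y, n_perm=200, seed=0):
    """p-value for H0: X and Y come from the same distribution, obtained
    by permuting sample labels. Minimal fixed-kernel sketch."""
    rng = np.random.default_rng(seed)
    Z, n = np.vstack([X, Y]), len(X)
    observed = mmd2(X, Y)
    count = 0
    for _ in range(n_perm):
        idx = rng.permutation(len(Z))
        count += mmd2(Z[idx[:n]], Z[idx[n:]]) >= observed
    return (count + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1, (50, 2))
Y = rng.normal(0.8, 1, (50, 2))
print(mmd_permutation_test(X, Y))   # small p-value: distributions differ
```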
no code implementations • 1 Aug 2019 • Shan Xue, Jie Lu, Guangquan Zhang
By generating random walks in a structurally rich domain and transferring the knowledge carried by those walks across domains, it also enables a network representation for the structurally scarce domain.
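The walk-generation step itself is standard; the sketch below shows truncated random walks over an adjacency list, as in DeepWalk-style embedding methods, and illustrates only that step, not the paper's cross-domain transfer mechanism:

```python
import random

def random_walks(adj, walks_per_node=2, walk_length=5, seed=0):
    """Generate truncated random walks over an adjacency list, as used by
    random-walk-based network embedding methods. Illustrative sketch of
    the walk-generation step only."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_length and adj[walk[-1]]:
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(random_walks(graph))
```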
1 code implementation • 19 Jul 2019 • Zhen Fang, Jie Lu, Feng Liu, Junyu Xuan, Guangquan Zhang
The aim of unsupervised domain adaptation is to leverage the knowledge in a labeled (source) domain to improve a model's learning performance in an unlabeled (target) domain -- the basic strategy being to mitigate the effects of discrepancies between the two distributions.
Ranked #18 on Domain Adaptation on Office-Home
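To make the "mitigate distribution discrepancy" strategy concrete, one widely used instance is correlation alignment (CORAL), which penalizes the gap between source and target feature covariances; the sketch below is illustrative and is not this particular paper's algorithm:

```python
import torch

def coral_loss(source_feats, target_feats):
    """Correlation alignment (CORAL): penalize the difference between
    source and target feature covariances. One common instance of the
    discrepancy-minimization strategy; illustrative only."""
    def cov(f):
        f = f - f.mean(dim=0, keepdim=True)
        return (f.t() @ f) / (f.size(0) - 1)
    d = source_feats.size(1)
    return ((cov(source_feats) - cov(target_feats)) ** 2).sum() / (4 * d * d)

src = torch.randn(64, 16)   # features from the labeled source domain
tgt = torch.randn(64, 16)   # features from the unlabeled target domain
print(coral_loss(src, tgt))
```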
1 code implementation • 19 May 2019 • Feng Liu, Jie Lu, Bo Han, Gang Niu, Guangquan Zhang, Masashi Sugiyama
Hence, we consider a new, more realistic and more challenging problem setting, where classifiers have to be trained with noisy labeled data from the source domain (SD) and unlabeled data from the target domain (TD) -- we name it wildly UDA (WUDA).
Unsupervised Domain Adaptation
Wildly Unsupervised Domain Adaptation
3 code implementations • 22 Dec 2018 • Bin Wang, Jie Lu, Zheng Yan, Huaishao Luo, Tianrui Li, Yu Zheng, Guangquan Zhang
We cast the weather forecasting problem as an end-to-end deep learning problem and solve it by proposing a novel negative log-likelihood error (NLE) loss function.
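As background on likelihood-style losses for forecasting, one common construction has the network output a predictive mean and log-variance and trains with a Gaussian negative log-likelihood; the sketch below shows the general idea, and the paper's exact NLE formulation may differ in detail:

```python
import torch

def gaussian_nll(mean, log_var, target):
    """Negative log-likelihood of targets under per-point Gaussian
    predictions N(mean, exp(log_var)). A common way to train a network
    that forecasts both a value and its uncertainty; illustrative sketch."""
    return 0.5 * (log_var + (target - mean) ** 2 / log_var.exp()).mean()

mean = torch.randn(32, requires_grad=True)     # predicted values
log_var = torch.zeros(32, requires_grad=True)  # predicted log-variances
target = torch.randn(32)                       # observed values
gaussian_nll(mean, log_var, target).backward()
```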
no code implementations • 18 Jul 2017 • Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu
The cooperative hierarchical structure is a common and significant data structure observed in, or adopted by, many research areas, such as text mining (author-paper-word) and multi-label classification (label-instance-feature).
no code implementations • 12 Jul 2015 • Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo
Under this same framework, two classes of correlation function are proposed: (1) using the bivariate beta distribution and (2) using the copula function.
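To make "copula function" concrete: a copula couples uniform marginals with a chosen dependence structure. A minimal sketch of sampling from a standard Gaussian copula follows; the paper's specific correlation constructions are not reproduced here:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def gaussian_copula_samples(rho, n, seed=0):
    """Draw n correlated uniform pairs from a Gaussian copula with
    correlation rho. Shown only to make the copula idea concrete."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = multivariate_normal(mean=[0, 0], cov=cov).rvs(size=n,
                                                      random_state=seed)
    return norm.cdf(z)   # marginals are Uniform(0,1), dependence is kept

u = gaussian_copula_samples(0.8, 1000)
print(np.corrcoef(u[:, 0], u[:, 1])[0, 1])   # strongly correlated uniforms
```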
no code implementations • 30 Mar 2015 • Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo
One branch of these works is the so-called Author Topic Model (ATM), which incorporates the authors' interests as side information into the classical topic model.
no code implementations • 30 Mar 2015 • Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo
Traditional Relational Topic Models provide a way to discover the hidden topics from a document network.