Search Results for author: Guangquan Zhang

Found 26 papers, 11 papers with code

Online Boosting Adaptive Learning under Concept Drift for Multistream Classification

no code implementations 17 Dec 2023 En Yu, Jie Lu, Bin Zhang, Guangquan Zhang

Specifically, OBAL operates in a dual-phase mechanism. In the first phase, we design an Adaptive COvariate Shift Adaptation (AdaCOSA) algorithm that constructs an initialized ensemble model from archived data of the various source streams, mitigating covariate shift while learning the dynamic correlations via an adaptive re-weighting strategy.
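
The adaptive re-weighting idea above can be illustrated with a generic covariate-shift sketch (standard importance weighting, not the AdaCOSA algorithm itself): train a small discriminator to tell source from target samples, then weight each source sample by P(target|x)/P(source|x).

```python
import numpy as np

def covariate_shift_weights(source, target, lr=0.1, steps=500):
    """Estimate importance weights w(x) = p_target(x) / p_source(x)
    with a tiny logistic-regression discriminator (pure NumPy)."""
    X = np.vstack([source, target])
    y = np.r_[np.zeros(len(source)), np.ones(len(target))]  # 0=source, 1=target
    X1 = np.hstack([X, np.ones((len(X), 1))])               # add bias column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):                                  # gradient descent
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - y) / len(y)
    S1 = np.hstack([source, np.ones((len(source), 1))])
    ps = 1.0 / (1.0 + np.exp(-S1 @ w))                      # P(target | x)
    return ps / (1.0 - ps)                                  # density-ratio weights

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(500, 1))   # source: N(0, 1)
tgt = rng.normal(1.0, 1.0, size=(500, 1))   # target: N(1, 1)
weights = covariate_shift_weights(src, tgt)
# Source points closer to the target mean receive larger weights.
print(weights[src[:, 0] > 1].mean() > weights[src[:, 0] < 0].mean())  # True
```

Reweighting source data this way lets a model trained on the source stream better match the target distribution, which is the general intuition behind covariate-shift adaptation.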

Meta OOD Learning for Continuously Adaptive OOD Detection

no code implementations ICCV 2023 Xinheng Wu, Jie Lu, Zhen Fang, Guangquan Zhang

To address CAOOD, we develop meta OOD learning (MOL) by designing a learning-to-adapt paradigm such that a well-initialized OOD detection model is learned during the training process.

Out of Distribution (OOD) Detection

Multi-class Classification with Fuzzy-feature Observations: Theory and Algorithms

1 code implementation 9 Jun 2022 Guangzhi Ma, Jie Lu, Feng Liu, Zhen Fang, Guangquan Zhang

Hence, in this paper, we propose a novel framework to address a new realistic problem called multi-class classification with imprecise observations (MCIMO), where we need to train a classifier with fuzzy-feature observations.

Classification Multi-class Classification

Bayesian Transfer Learning: An Overview of Probabilistic Graphical Models for Transfer Learning

no code implementations 27 Sep 2021 Junyu Xuan, Jie Lu, Guangquan Zhang

Transfer learning, in which transferable knowledge is extracted from the source domain(s) and reused in the target domain, has become a research area of great interest in the field of artificial intelligence.

Transfer Learning

Learning Bounds for Open-Set Learning

1 code implementation 30 Jun 2021 Zhen Fang, Jie Lu, Anjin Liu, Feng Liu, Guangquan Zhang

In this paper, we target a more challenging and realistic setting: open-set learning (OSL), where there exist test samples from the classes that are unseen during training.

Learning Theory Open Set Learning +1

Automatic Learning to Detect Concept Drift

no code implementations 4 May 2021 Hang Yu, Tianyu Liu, Jie Lu, Guangquan Zhang

Many methods have been proposed to detect concept drift, i.e., changes in the distribution of streaming data, because concept drift causes a decrease in the prediction accuracy of algorithms.
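
As a deliberately simple illustration of what a concept drift detector consumes and produces (the window size and margin below are illustrative choices, not from the paper): flag drift when the error rate over a recent window rises well above that of an older reference window.

```python
from collections import deque

class WindowDriftDetector:
    """Toy drift detector: signal drift when the recent error rate
    exceeds the reference-window error rate by a fixed margin."""
    def __init__(self, window=100, margin=0.15):
        self.ref = deque(maxlen=window)     # older errors (reference)
        self.cur = deque(maxlen=window)     # most recent errors
        self.margin = margin

    def update(self, error):                # error: 0 (correct) or 1 (wrong)
        if len(self.cur) == self.cur.maxlen:
            self.ref.append(self.cur.popleft())
        self.cur.append(error)
        if len(self.ref) < self.ref.maxlen:
            return False                    # not enough history yet
        ref_rate = sum(self.ref) / len(self.ref)
        cur_rate = sum(self.cur) / len(self.cur)
        return cur_rate - ref_rate > self.margin

det = WindowDriftDetector()
stream = [0] * 300 + [1] * 60              # accuracy collapses at step 300
drift_at = next((i for i, e in enumerate(stream) if det.update(e)), None)
print(drift_at)  # 315: detected 16 errors after the change point
```

Real detectors replace the fixed margin with a statistical test on the two windows; the interface (feed per-instance errors, receive a drift signal) is the same.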

Active Learning Meta-Learning

PAC-Bayes Bounds for Meta-learning with Data-Dependent Prior

1 code implementation 7 Feb 2021 Tianyu Liu, Jie Lu, Zheng Yan, Guangquan Zhang

By leveraging experience from previous tasks, meta-learning algorithms can achieve effective fast adaptation ability when encountering new tasks.

Meta-Learning

How does the Combined Risk Affect the Performance of Unsupervised Domain Adaptation Approaches?

no code implementations 30 Dec 2020 Li Zhong, Zhen Fang, Feng Liu, Jie Lu, Bo Yuan, Guangquan Zhang

Experiments show that the proxy can effectively curb the increase of the combined risk when minimizing the source risk and distribution discrepancy.

Unsupervised Domain Adaptation

Concept Drift Detection: Dealing with Missing Values via Fuzzy Distance Estimations

1 code implementation 9 Aug 2020 Anjin Liu, Jie Lu, Guangquan Zhang

Our solution comprises a novel masked distance learning (MDL) algorithm to reduce the cumulative errors caused by iteratively estimating each missing value in an observation and a fuzzy-weighted frequency (FWF) method for identifying discrepancies in the data distribution.
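
A generic masked-distance sketch (not the paper's MDL algorithm) shows the basic idea of comparing observations that contain missing values without first imputing them:

```python
import numpy as np

def masked_distance(a, b):
    """Euclidean distance over the dimensions observed in BOTH vectors,
    rescaled to the full dimensionality (NaN marks a missing value).
    Generic sketch; the paper's fuzzy distance estimation is more refined."""
    mask = ~(np.isnan(a) | np.isnan(b))
    if not mask.any():
        return np.nan                        # nothing to compare
    d = np.sqrt(np.sum((a[mask] - b[mask]) ** 2))
    return d * np.sqrt(len(a) / mask.sum())  # correct for dropped dimensions

x = np.array([1.0, np.nan, 3.0])
y = np.array([1.0, 2.0, 5.0])
d = masked_distance(x, y)
print(round(d, 3))  # → 2.449
```

Skipping missing dimensions and rescaling avoids the cumulative errors that iterative imputation can introduce, which is the motivation stated in the abstract.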

Imputation

Learning from a Complementary-label Source Domain: Theory and Algorithms

1 code implementation 4 Aug 2020 Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu

We consider two cases of this setting, one is that the source domain only contains complementary-label data (completely complementary unsupervised domain adaptation, CC-UDA), and the other is that the source domain has plenty of complementary-label data and a small amount of true-label data (partly complementary unsupervised domain adaptation, PC-UDA).
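
A minimal sketch of how a complementary label can drive a loss function (a common simple baseline, not the paper's CC-UDA/PC-UDA estimators): a complementary label says "the true class is NOT this one", so one can penalize the probability mass the model assigns to it.

```python
import numpy as np

def complementary_loss(logits, comp_labels):
    """Baseline loss for complementary labels: penalize the softmax
    probability assigned to the ruled-out class via -log(1 - p).
    A simple baseline, not the paper's risk estimator."""
    z = logits - logits.max(axis=1, keepdims=True)   # numerically stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    p_bar = p[np.arange(len(p)), comp_labels]        # prob. of the ruled-out class
    return -np.log(np.clip(1.0 - p_bar, 1e-12, None)).mean()

logits = np.array([[2.0, 0.1, -1.0],    # model confident in class 0
                   [0.0, 3.0, 0.0]])    # model confident in class 1
# Complementary labels consistent with the predictions ("not 2", "not 0"):
low = complementary_loss(logits, np.array([2, 0]))
# Complementary labels contradicting them ("not 0", "not 1"):
high = complementary_loss(logits, np.array([0, 1]))
print(low < high)  # True
```

Because complementary labels are far cheaper to collect than true labels, losses of this kind make the CC-UDA setting practical to train.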

Unsupervised Domain Adaptation

Clarinet: A One-step Approach Towards Budget-friendly Unsupervised Domain Adaptation

1 code implementation 29 Jul 2020 Yiyang Zhang, Feng Liu, Zhen Fang, Bo Yuan, Guangquan Zhang, Jie Lu

To mitigate this problem, we consider a novel problem setting in which the classifier for the target domain has to be trained with complementary-label data from the source domain and unlabeled data from the target domain, a setting we name budget-friendly UDA (BFUDA).

Unsupervised Domain Adaptation

Bridging the Theoretical Bound and Deep Algorithms for Open Set Domain Adaptation

no code implementations 23 Jun 2020 Li Zhong, Zhen Fang, Feng Liu, Bo Yuan, Guangquan Zhang, Jie Lu

To achieve this aim, a previous study has proven an upper bound of the target-domain risk, and the open set difference, as an important term in the upper bound, is used to measure the risk on unknown target data.

Domain Adaptation Object Recognition

Diverse Instances-Weighting Ensemble based on Region Drift Disagreement for Concept Drift Adaptation

no code implementations 13 Apr 2020 Anjin Liu, Jie Lu, Guangquan Zhang

Concept drift refers to changes in the distribution of underlying data and is an inherent property of evolving data streams.

Ensemble Learning

Learning under Concept Drift: A Review

no code implementations 13 Apr 2020 Jie Lu, Anjin Liu, Fan Dong, Feng Gu, Joao Gama, Guangquan Zhang

To help researchers identify which research topics are significant and how to apply the related techniques in data analysis tasks, it is necessary to conduct a high-quality, instructive review of current research developments and trends in the concept drift field.

Wildly Unsupervised Domain Adaptation and Its Powerful and Efficient Solution

no code implementations 25 Sep 2019 Feng Liu, Jie Lu, Bo Han, Gang Niu, Guangquan Zhang, Masashi Sugiyama

Hence, we consider a new, more realistic and more challenging problem setting, where classifiers have to be trained with noisy labeled data from SD and unlabeled data from TD -- we name it wildly UDA (WUDA).

Unsupervised Domain Adaptation Wildly Unsupervised Domain Adaptation

Cross-domain Network Representations

no code implementations 1 Aug 2019 Shan Xue, Jie Lu, Guangquan Zhang

By generating random walks in a structurally rich domain and transferring the knowledge encoded in those walks across domains, the method enables a network representation for the structurally scarce domain as well.
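
Walk-based network representations start from plain random walks over a graph; the sketch below generates such walks (a generic sketch over a toy adjacency list; the paper's cross-domain transfer step is not shown).

```python
import random

def random_walks(adj, num_walks=2, walk_len=5, seed=0):
    """Generate uniform random walks over an adjacency-list graph,
    the raw material that walk-based network representations learn from."""
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in adj:                   # one walk per node per pass
            walk = [start]
            while len(walk) < walk_len and adj[walk[-1]]:
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [0]}
walks = random_walks(graph)
print(len(walks))  # → 8 (2 passes over 4 start nodes)
```

The walks are then fed to a sequence model (word2vec-style) to produce node embeddings; in a cross-domain setting, walks from the rich domain supply the structure the scarce domain lacks.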

Transfer Learning

Open Set Domain Adaptation: Theoretical Bound and Algorithm

1 code implementation 19 Jul 2019 Zhen Fang, Jie Lu, Feng Liu, Junyu Xuan, Guangquan Zhang

The aim of unsupervised domain adaptation is to leverage the knowledge in a labeled (source) domain to improve a model's learning performance with an unlabeled (target) domain -- the basic strategy being to mitigate the effects of discrepancies between the two distributions.

Unsupervised Domain Adaptation

Butterfly: One-step Approach towards Wildly Unsupervised Domain Adaptation

1 code implementation 19 May 2019 Feng Liu, Jie Lu, Bo Han, Gang Niu, Guangquan Zhang, Masashi Sugiyama

Hence, we consider a new, more realistic and more challenging problem setting, where classifiers have to be trained with noisy labeled data from SD and unlabeled data from TD -- we name it wildly UDA (WUDA).

Unsupervised Domain Adaptation Wildly Unsupervised Domain Adaptation

Deep Uncertainty Quantification: A Machine Learning Approach for Weather Forecasting

3 code implementations 22 Dec 2018 Bin Wang, Jie Lu, Zheng Yan, Huaishao Luo, Tianrui Li, Yu Zheng, Guangquan Zhang

We cast the weather forecasting problem as an end-to-end deep learning problem and solve it by proposing a novel negative log-likelihood error (NLE) loss function.
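
For intuition, NLL-style losses such as the paper's NLE are built on the negative log-likelihood of the observations under a predictive distribution; the Gaussian case is sketched below (illustrative only; the exact NLE formulation may differ in detail).

```python
import numpy as np

def gaussian_nll(y, mu, sigma):
    """Negative log-likelihood of y under N(mu, sigma^2), averaged over
    samples: the standard building block of NLL-style regression losses."""
    var = np.maximum(sigma ** 2, 1e-12)     # guard against zero variance
    return np.mean(0.5 * np.log(2 * np.pi * var) + (y - mu) ** 2 / (2 * var))

y = np.array([20.0, 22.0, 19.0])            # observed temperatures
good = gaussian_nll(y, mu=np.array([20.1, 21.8, 19.2]), sigma=np.full(3, 1.0))
bad = gaussian_nll(y, mu=np.array([25.0, 15.0, 30.0]), sigma=np.full(3, 1.0))
print(good < bad)  # True: accurate forecasts score a lower loss
```

Unlike squared error alone, this loss also trains the predicted spread (sigma), which is what gives the forecasts their uncertainty estimates.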

BIG-bench Machine Learning Uncertainty Quantification +1

Cooperative Hierarchical Dirichlet Processes: Superposition vs. Maximization

no code implementations 18 Jul 2017 Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu

The cooperative hierarchical structure is a common and significant data structure observed in, or adopted by, many research areas, such as text mining (author-paper-word) and multi-label classification (label-instance-feature).

Multi-Label Classification Topic Models

Dependent Indian Buffet Process-based Sparse Nonparametric Nonnegative Matrix Factorization

no code implementations 12 Jul 2015 Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo

Under this same framework, two classes of correlation function are proposed: (1) using the bivariate beta distribution and (2) using copula functions.

Clustering Recommendation Systems

Infinite Author Topic Model based on Mixed Gamma-Negative Binomial Process

no code implementations 30 Mar 2015 Junyu Xuan, Jie Lu, Guangquan Zhang, Richard Yi Da Xu, Xiangfeng Luo

One branch of these works is the so-called Author Topic Model (ATM), which incorporates the authors' interests as side information into the classical topic model.

Information Retrieval Retrieval
