Search Results for author: Jiangchao Yao

Found 42 papers, 25 papers with code

UniBrain: Universal Brain MRI Diagnosis with Hierarchical Knowledge-enhanced Pre-training

1 code implementation • 13 Sep 2023 • Jiayu Lei, Lisong Dai, Haoyun Jiang, Chaoyi Wu, Xiaoman Zhang, Yao Zhang, Jiangchao Yao, Weidi Xie, Yanyong Zhang, Yuehua Li, Ya Zhang, Yanfeng Wang

Magnetic resonance imaging (MRI) has played a crucial role in brain disease diagnosis, for which a range of computer-aided artificial intelligence methods have been proposed.

Bag of Tricks for Long-Tailed Multi-Label Classification on Chest X-Rays

no code implementations • 17 Aug 2023 • Feng Hong, Tianjie Dai, Jiangchao Yao, Ya Zhang, Yanfeng Wang

Clinical classification of chest radiography is particularly challenging for standard machine learning algorithms due to its inherent long-tailed and multi-label nature.

Data Augmentation · Multi-Label Classification
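Reweighting the loss by class frequency is one of the standard tricks for long-tailed multi-label problems like this one. The sketch below is a generic illustration, not necessarily one of the paper's tricks; `balanced_bce` and its arguments are hypothetical names, and the "effective number" weighting follows the common recipe from the long-tailed literature.

```python
import numpy as np

def balanced_bce(probs, targets, class_counts, beta=0.999):
    """Binary cross-entropy with 'effective number' class weights:
    rarer classes receive larger weights, a common reweighting
    trick for long-tailed multi-label data."""
    counts = np.asarray(class_counts, dtype=float)
    weights = (1.0 - beta) / (1.0 - beta ** counts)  # rarer -> larger
    weights = weights / weights.sum() * len(counts)  # normalize to mean 1
    probs = np.clip(probs, 1e-7, 1.0 - 1e-7)
    per_class = -(targets * np.log(probs) + (1 - targets) * np.log(1 - probs))
    return float((weights * per_class).mean())
```

With counts [100, 5] and beta=0.999, the rare class gets roughly 20x the weight of the frequent one before normalization.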

Balanced Destruction-Reconstruction Dynamics for Memory-replay Class Incremental Learning

1 code implementation • 3 Aug 2023 • YuHang Zhou, Jiangchao Yao, Feng Hong, Ya Zhang, Yanfeng Wang

By dynamically manipulating the gradient during training based on these factors, BDR can effectively alleviate knowledge destruction and improve knowledge reconstruction.

class-incremental learning · Class Incremental Learning +1

On Strengthening and Defending Graph Reconstruction Attack with Markov Chain Approximation

1 code implementation • 15 Jun 2023 • Zhanke Zhou, Chenyu Zhou, Xuan Li, Jiangchao Yao, Quanming Yao, Bo Han

Although powerful graph neural networks (GNNs) have boosted numerous real-world applications, the potential privacy risk is still underexplored.

Graph Reconstruction · Reconstruction Attack

Zero-shot Composed Text-Image Retrieval

1 code implementation • 12 Jun 2023 • Yikun Liu, Jiangchao Yao, Ya Zhang, Yanfeng Wang, Weidi Xie

In this paper, we consider the problem of composed image retrieval (CIR), which aims to train a model that can fuse multi-modal information, e.g., text and images, to accurately retrieve images that match the query, extending the user's expression ability.

Image Retrieval · Retrieval +1
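Composed retrieval fuses a reference image and a modifying text into one query. A minimal sketch of that idea, with a simple weighted sum standing in for the paper's learned fusion module (function and argument names are illustrative):

```python
import numpy as np

def fuse_and_retrieve(img_emb, txt_emb, gallery, alpha=0.5):
    """Toy composed retrieval: fuse image and text embeddings by a
    weighted sum, then rank gallery images by cosine similarity."""
    q = alpha * img_emb + (1 - alpha) * txt_emb
    q = q / np.linalg.norm(q)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = g @ q
    return np.argsort(-scores)  # indices, best match first
```

In the real zero-shot setting the fusion would be produced by a model built on pretrained vision-language encoders rather than a fixed weighted sum.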

Unleashing Mask: Explore the Intrinsic Out-of-Distribution Detection Capability

1 code implementation • 6 Jun 2023 • Jianing Zhu, Hengzhuang Li, Jiangchao Yao, Tongliang Liu, Jianliang Xu, Bo Han

Based on such insights, we propose a novel method, Unleashing Mask, which aims to restore the OOD discriminative capabilities of the well-trained model with ID data.

Out-of-Distribution Detection · Out of Distribution (OOD) Detection

Exploring Model Dynamics for Accumulative Poisoning Discovery

1 code implementation • 6 Jun 2023 • Jianing Zhu, Xiawei Guo, Jiangchao Yao, Chao Du, Li He, Shuo Yuan, Tongliang Liu, Liang Wang, Bo Han

In this paper, we dive into the perspective of model dynamics and propose a novel information measure, namely, Memorization Discrepancy, to explore the defense via the model-level information.


Class-Balancing Diffusion Models

1 code implementation • CVPR 2023 • Yiming Qin, Huangjie Zheng, Jiangchao Yao, Mingyuan Zhou, Ya Zhang

To tackle this problem, we start from the hypothesis that the data distribution is not class-balanced, and propose Class-Balancing Diffusion Models (CBDM) that are trained with a distribution adjustment regularizer as a solution.
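The snippet above pairs a standard diffusion objective with a distribution-adjustment regularizer. As a rough illustration only, since the paper's exact regularizer is not given here, one can picture a denoising MSE plus a term that pulls class-conditional noise predictions toward the unconditional ones; all names and the weighting `tau` are assumptions.

```python
import numpy as np

def cbdm_style_loss(eps_pred_cond, eps_pred_uncond, eps_true, tau=0.1):
    """Illustrative objective: standard denoising MSE plus a regularizer
    that pulls class-conditional noise predictions toward unconditional
    ones, nudging generation toward a more balanced class distribution.
    A sketch of the idea, not the paper's exact loss."""
    mse = np.mean((eps_pred_cond - eps_true) ** 2)
    reg = np.mean((eps_pred_cond - eps_pred_uncond) ** 2)
    return mse + tau * reg
```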

Towards Efficient Task-Driven Model Reprogramming with Foundation Models

no code implementations • 5 Apr 2023 • Shoukai Xu, Jiangchao Yao, Ran Luo, Shuhai Zhang, Zihao Lian, Mingkui Tan, Bo Han, YaoWei Wang

Moreover, the data used for pretraining foundation models are usually invisible and very different from the target data of downstream tasks.

Knowledge Distillation · Transfer Learning

Combating Exacerbated Heterogeneity for Robust Models in Federated Learning

1 code implementation • 1 Mar 2023 • Jianing Zhu, Jiangchao Yao, Tongliang Liu, Quanming Yao, Jianliang Xu, Bo Han

Privacy and security concerns in real-world applications have led to the development of adversarially robust federated models.

Federated Learning

Latent Class-Conditional Noise Model

1 code implementation • 19 Feb 2023 • Jiangchao Yao, Bo Han, Zhihan Zhou, Ya Zhang, Ivor W. Tsang

We solve this problem by introducing a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.

Learning with noisy labels
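Transition-based methods such as LCCN model how clean labels are corrupted through a noise transition matrix. A minimal sketch of that mechanism follows; the matrix is fixed here for illustration, whereas LCCN infers it under a Bayesian framework.

```python
import numpy as np

def noisy_posterior(clean_probs, transition):
    """Map clean-class probabilities to noisy-label probabilities via a
    noise transition matrix T, where T[i, j] = p(noisy=j | clean=i).
    Learning such a T is the core of transition-based noisy-label methods."""
    return clean_probs @ transition

# Example: 10% symmetric label noise between two classes.
T = np.array([[0.9, 0.1],
              [0.1, 0.9]])
```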

Long-Tailed Partial Label Learning via Dynamic Rebalancing

1 code implementation • 10 Feb 2023 • Feng Hong, Jiangchao Yao, Zhihan Zhou, Ya Zhang, Yanfeng Wang

The straightforward combination of LT and PLL, i.e., LT-PLL, suffers from a fundamental dilemma: LT methods build upon a given class distribution that is unavailable in PLL, and the performance of PLL is severely influenced in the long-tailed context.

Partial Label Learning
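Many long-tailed methods rebalance via logit adjustment with the class prior; the dilemma described above is that in PLL this prior is unknown and must be estimated dynamically during training. A toy sketch with an assumed estimated prior (names are illustrative):

```python
import numpy as np

def rebalanced_logits(logits, est_prior, tau=1.0):
    """Logit adjustment with an *estimated* class prior: subtracting
    tau * log(prior) boosts rare classes. In LT-PLL the true prior is
    unavailable, so est_prior would be re-estimated as training proceeds;
    here it is simply a given array."""
    return logits - tau * np.log(np.asarray(est_prior) + 1e-12)
```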

Federated Domain Generalization With Generalization Adjustment

1 code implementation • CVPR 2023 • Ruipeng Zhang, Qinwei Xu, Jiangchao Yao, Ya Zhang, Qi Tian, Yanfeng Wang

Federated Domain Generalization (FedDG) attempts to learn a global model in a privacy-preserving manner that generalizes well to new clients possibly with domain shift.

Domain Generalization · Fairness +1

FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation

1 code implementation • 14 Dec 2022 • Ziqing Fan, Yanfeng Wang, Jiangchao Yao, Lingjuan Lyu, Ya Zhang, Qi Tian

However, in addition to previous explorations for improvement in federated averaging, our analysis shows that another critical bottleneck is the poorer optima of client models in more heterogeneous conditions.

Federated Learning

NAS-LID: Efficient Neural Architecture Search with Local Intrinsic Dimension

1 code implementation • 23 Nov 2022 • Xin He, Jiangchao Yao, Yuxin Wang, Zhenheng Tang, Ka Chu Cheung, Simon See, Bo Han, Xiaowen Chu

One-shot neural architecture search (NAS) substantially improves the search efficiency by training one supernet to estimate the performance of every possible child architecture (i.e., subnet).

Neural Architecture Search

Device-Cloud Collaborative Recommendation via Meta Controller

no code implementations • 7 Jul 2022 • Jiangchao Yao, Feng Wang, Xichen Ding, Shaohu Chen, Bo Han, Jingren Zhou, Hongxia Yang

To overcome this issue, we propose a meta controller to dynamically manage the collaboration between the on-device recommender and the cloud-based recommender, and introduce a novel efficient sample construction from the causal perspective to solve the dataset absence issue of meta controller.

Device-Cloud Collaboration

Contrastive Learning with Boosted Memorization

1 code implementation • 25 May 2022 • Zhihan Zhou, Jiangchao Yao, Yanfeng Wang, Bo Han, Ya Zhang

Different from previous works, we explore this direction from an alternative perspective, i.e., the data perspective, and propose a novel Boosted Contrastive Learning (BCL) method.

Contrastive Learning · Memorization +2

Edge-Cloud Polarization and Collaboration: A Comprehensive Survey for AI

1 code implementation • 11 Nov 2021 • Jiangchao Yao, Shengyu Zhang, Yang Yao, Feng Wang, Jianxin Ma, Jianwei Zhang, Yunfei Chu, Luo Ji, Kunyang Jia, Tao Shen, Anpeng Wu, Fengda Zhang, Ziqi Tan, Kun Kuang, Chao Wu, Fei Wu, Jingren Zhou, Hongxia Yang

However, edge computing, especially edge-cloud collaborative computing, is still in its infancy, with success yet to be announced due to resource-constrained IoT scenarios and the very limited algorithms deployed.

Cloud Computing · Edge-computing

$\alpha$-Weighted Federated Adversarial Training

no code implementations • 29 Sep 2021 • Jianing Zhu, Jiangchao Yao, Tongliang Liu, Kunyang Jia, Jingren Zhou, Bo Han, Hongxia Yang

Federated Adversarial Training (FAT) helps us address data privacy and governance issues while maintaining model robustness to adversarial attacks.

Adversarial Attack · Federated Learning

MC$^2$-SF: Slow-Fast Learning for Mobile-Cloud Collaborative Recommendation

no code implementations • 25 Sep 2021 • Zeyuan Chen, Jiangchao Yao, Feng Wang, Kunyang Jia, Bo Han, Wei Zhang, Hongxia Yang

With the hardware development of mobile devices, it is possible to build recommendation models on the mobile side to utilize fine-grained features and real-time feedback.

Cooperative Learning for Noisy Supervision

no code implementations • 11 Aug 2021 • Hao Wu, Jiangchao Yao, Ya Zhang, Yanfeng Wang

Learning with noisy labels has gained enormous interest in the robust deep learning area.

Learning with noisy labels

Reliable Adversarial Distillation with Unreliable Teachers

2 code implementations • ICLR 2022 • Jianing Zhu, Jiangchao Yao, Bo Han, Jingfeng Zhang, Tongliang Liu, Gang Niu, Jingren Zhou, Jianliang Xu, Hongxia Yang

However, when considering adversarial robustness, teachers may become unreliable and adversarial distillation may not work: teachers are pretrained on their own adversarial data, and it is too demanding to require that teachers are also good at every adversarial example queried by students.

Adversarial Robustness

Contrastive Attraction and Contrastive Repulsion for Representation Learning

1 code implementation • 8 May 2021 • Huangjie Zheng, Xu Chen, Jiangchao Yao, Hongxia Yang, Chunyuan Li, Ya Zhang, Hao Zhang, Ivor Tsang, Jingren Zhou, Mingyuan Zhou

We realize this strategy with contrastive attraction and contrastive repulsion (CACR), which makes the query not only exert a greater force to attract more distant positive samples but also do so to repel closer negative samples.

Contrastive Learning · Representation Learning
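The CACR intuition can be caricatured with illustrative softmax weights: farther positives get larger attraction weights and closer negatives get larger repulsion weights. The paper derives these weights from conditional distributions, so the sketch below is only a simplification with assumed names.

```python
import numpy as np

def cacr_style_weights(q, positives, negatives):
    """Toy version of the CACR weighting: positives are weighted by
    their distance to the query (farther -> pulled harder), negatives
    by their similarity to it (closer -> pushed harder)."""
    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()
    pos_dist = np.linalg.norm(positives - q, axis=1)  # larger = farther
    neg_sim = negatives @ q                           # larger = closer
    return softmax(pos_dist), softmax(neg_sim)
```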

Device-Cloud Collaborative Learning for Recommendation

no code implementations • 14 Apr 2021 • Jiangchao Yao, Feng Wang, Kunyang Jia, Bo Han, Jingren Zhou, Hongxia Yang

With the rapid development of storage and computing power on mobile devices, it becomes critical and popular to deploy models on devices to save onerous communication latencies and to capture real-time features.

Collaborative Label Correction via Entropy Thresholding

no code implementations • 31 Mar 2021 • Hao Wu, Jiangchao Yao, Jiajie Wang, Yinru Chen, Ya Zhang, Yanfeng Wang

Deep neural networks (DNNs) have the capacity to fit extremely noisy labels; nonetheless, they tend to learn data with clean labels first and then memorize those with noisy labels.
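Because clean labels are learned first, low prediction entropy is a usable proxy for label cleanliness. A toy version of entropy thresholding, with an assumed function name and threshold value:

```python
import numpy as np

def low_entropy_mask(probs, threshold=0.3):
    """Select samples whose prediction entropy falls below a threshold;
    confident (low-entropy) samples are more likely to carry clean labels
    early in training."""
    probs = np.clip(probs, 1e-12, 1.0)
    entropy = -np.sum(probs * np.log(probs), axis=1)
    return entropy < threshold
```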

Learning with Group Noise

no code implementations • 17 Mar 2021 • Qizhou Wang, Jiangchao Yao, Chen Gong, Tongliang Liu, Mingming Gong, Hongxia Yang, Bo Han

Most of the previous approaches in this area focus on the pairwise relation (causal or correlational relationship) with noise, such as learning with noisy labels.

Learning with noisy labels

Sparse-Interest Network for Sequential Recommendation

1 code implementation • 18 Feb 2021 • Qiaoyu Tan, Jianwei Zhang, Jiangchao Yao, Ninghao Liu, Jingren Zhou, Hongxia Yang, Xia Hu

Our sparse-interest module can adaptively infer a sparse set of concepts for each user from the large concept pool and output multiple embeddings accordingly.

Sequential Recommendation
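A minimal sketch of the sparse-interest idea: select a top-k subset of a large concept pool per user and emit one embedding per activated concept. The dot-product affinity and elementwise gating used here are simplifications, not the paper's architecture.

```python
import numpy as np

def sparse_interests(user_emb, concept_pool, k=3):
    """Pick the top-k concepts for one user by dot-product affinity and
    return one gated embedding per activated concept."""
    scores = concept_pool @ user_emb          # affinity to every concept
    top = np.argsort(-scores)[:k]             # indices of top-k concepts
    return top, user_emb * concept_pool[top]  # k interest embeddings
```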

Learning on Attribute-Missing Graphs

3 code implementations • 3 Nov 2020 • Xu Chen, Siheng Chen, Jiangchao Yao, Huangjie Zheng, Ya Zhang, Ivor W Tsang

Thereby, designing a new GNN for these graphs is a pressing issue for the graph learning community.

Graph Learning · Link Prediction

Decoupled Variational Embedding for Signed Directed Networks

1 code implementation • 28 Aug 2020 • Xu Chen, Jiangchao Yao, Maosen Li, Ya Zhang, Yan-Feng Wang

Comprehensive results on both link sign prediction and node recommendation task demonstrate the effectiveness of DVE.

Link Sign Prediction · Node Classification +1

Bayes EMbedding (BEM): Refining Representation by Integrating Knowledge Graphs and Behavior-specific Networks

1 code implementation • 28 Aug 2019 • Yuting Ye, Xuwu Wang, Jiangchao Yao, Kunyang Jia, Jingren Zhou, Yanghua Xiao, Hongxia Yang

Low-dimensional embeddings of knowledge graphs and behavior graphs have proved remarkably powerful in a variety of tasks, from predicting unobserved edges between entities to content recommendation.

General Classification · Knowledge Graph Embedding +3

Node Attribute Generation on Graphs

3 code implementations • 23 Jul 2019 • Xu Chen, Siheng Chen, Huangjie Zheng, Jiangchao Yao, Kenan Cui, Ya Zhang, Ivor W. Tsang

NANG learns a unifying latent representation which is shared by both node attributes and graph structures and can be translated to different modalities.

Data Augmentation · General Classification +2

Safeguarded Dynamic Label Regression for Generalized Noisy Supervision

1 code implementation • 6 Mar 2019 • Jiangchao Yao, Ya Zhang, Ivor W. Tsang, Jun Sun

We further generalize LCCN for open-set noisy labels and the semi-supervised setting.

Ranked #33 on Image Classification on Clothing1M (using extra training data)

Learning with noisy labels · regression

Pumpout: A Meta Approach for Robustly Training Deep Neural Networks with Noisy Labels

no code implementations • 27 Sep 2018 • Bo Han, Gang Niu, Jiangchao Yao, Xingrui Yu, Miao Xu, Ivor Tsang, Masashi Sugiyama

To handle these issues, by using the memorization effects of deep neural networks, we may train deep neural networks on the whole dataset only in the first few iterations.


Variational Collaborative Learning for User Probabilistic Representation

no code implementations • 22 Sep 2018 • Kenan Cui, Xu Chen, Jiangchao Yao, Ya Zhang

Conventional CF-based methods use the user-item interaction data as the sole information source to recommend items to users.

Collaborative Filtering · Recommendation Systems

Understanding VAEs in Fisher-Shannon Plane

no code implementations • 10 Jul 2018 • Huangjie Zheng, Jiangchao Yao, Ya Zhang, Ivor W. Tsang, Jia Wang

In information theory, Fisher information and Shannon information (entropy) are respectively used to quantify the uncertainty associated with the distribution modeling and the uncertainty in specifying the outcome of given variables.

Representation Learning
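For reference, the two information measures named in the snippet above are standard:

```latex
% Shannon (differential) entropy of a density p(x):
H(p) = -\int p(x)\,\log p(x)\,\mathrm{d}x
% Fisher information of a parametric model p(x \mid \theta):
I(\theta) = \mathbb{E}_{x \sim p(\cdot \mid \theta)}
    \left[ \left( \frac{\partial}{\partial \theta} \log p(x \mid \theta) \right)^{2} \right]
```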

Masking: A New Perspective of Noisy Supervision

2 code implementations • NeurIPS 2018 • Bo Han, Jiangchao Yao, Gang Niu, Mingyuan Zhou, Ivor Tsang, Ya Zhang, Masashi Sugiyama

It is important to learn various types of classifiers given training data with noisy labels.

Ranked #39 on Image Classification on Clothing1M (using extra training data)

Image Classification

Variational Composite Autoencoders

no code implementations • 12 Apr 2018 • Jiangchao Yao, Ivor Tsang, Ya Zhang

Learning in the latent variable model is challenging in the presence of complex data structures or intractable latent variables.

Degeneration in VAE: in the Light of Fisher Information Loss

no code implementations • 19 Feb 2018 • Huangjie Zheng, Jiangchao Yao, Ya Zhang, Ivor W. Tsang

While enormous progress has been made on Variational Autoencoders (VAEs) in recent years, similar to other deep networks, VAEs with deep networks suffer from the problem of degeneration, which seriously weakens the correlation between the input and the corresponding latent codes, deviating from the goal of representation learning.

Representation Learning

Collaborative Learning for Weakly Supervised Object Detection

no code implementations • 10 Feb 2018 • Jiajie Wang, Jiangchao Yao, Ya Zhang, Rui Zhang

For object detection, taking WSDDN-like architecture as weakly supervised detector sub-network and Faster-RCNN-like architecture as strongly supervised detector sub-network, we propose an end-to-end Weakly Supervised Collaborative Detection Network.

object-detection · Weakly Supervised Object Detection

Deep Learning from Noisy Image Labels with Quality Embedding

no code implementations • 2 Nov 2017 • Jiangchao Yao, Jiajie Wang, Ivor Tsang, Ya Zhang, Jun Sun, Chengqi Zhang, Rui Zhang

However, the label noise among the datasets severely degrades the performance of deep learning approaches.
