no code implementations • 16 Jan 2025 • Xianghu Yue, Yiming Chen, Xueyi Zhang, Xiaoxue Gao, Mengling Feng, Mingrui Lao, Huiping Zhuang, Haizhou Li
Concretely, we devise modality-specific prompts to compensate for missing information, facilitating the model to maintain a holistic representation of the data.
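A minimal sketch of this idea, assuming a transformer-style fusion model: one learnable prompt per modality stands in for that modality's features whenever it is absent. The class and parameter names (`PromptCompensatedFusion`, `prompt_len`) are illustrative, not the paper's code.

```python
# Minimal sketch: learnable modality-specific prompts substituted for
# missing-modality features before fusion (illustrative, not the paper's code).
import torch
import torch.nn as nn

class PromptCompensatedFusion(nn.Module):
    def __init__(self, modalities=("audio", "vision", "text"), dim=256, prompt_len=4):
        super().__init__()
        # One learnable prompt per modality, used when that modality is absent.
        self.prompts = nn.ParameterDict({
            m: nn.Parameter(torch.randn(prompt_len, dim) * 0.02) for m in modalities
        })
        self.fuse = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)

    def forward(self, features: dict):
        # features: modality name -> (batch, seq, dim) tensor, or None if missing.
        batch = next(f.size(0) for f in features.values() if f is not None)
        parts = []
        for m, prompt in self.prompts.items():
            f = features.get(m)
            if f is None:  # missing modality: fall back to its learned prompt
                f = prompt.unsqueeze(0).expand(batch, -1, -1)
            parts.append(f)
        return self.fuse(torch.cat(parts, dim=1))
```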
1 code implementation • 14 Dec 2024 • Jiaxu Li, Songning Lai, Rui Li, Di Fang, Kejia Fan, Jianheng Tang, Yuhan Zhao, Rongchang Zhao, Dongzhan Zhou, Yutao Yue, Huiping Zhuang
Extensive experiments on the Pascal VOC2012 dataset show that SegACIL achieves superior performance in the sequential, disjoint, and overlap settings, offering a robust solution to the challenges of class-incremental semantic segmentation.
no code implementations • 29 Oct 2024 • Yufei Zhang, Yicheng Xu, Hongxin Wei, Zhiping Lin, Huiping Zhuang
We introduce analytic learning into TTA, using analytic classifiers (ACs) to prevent model forgetting.
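A minimal sketch of an analytic classifier, assuming frozen features and one-hot targets: the weights come from a single ridge-regression solve rather than gradient descent, which is what removes gradient-based forgetting. The function name and `gamma` default are illustrative.

```python
# Minimal sketch: an analytic (closed-form ridge-regression) classifier
# over frozen features; no gradient descent, hence no gradient-based forgetting.
import numpy as np

def fit_analytic_classifier(X, y, num_classes, gamma=1e-3):
    """X: (n, d) feature matrix; y: (n,) integer labels."""
    Y = np.eye(num_classes)[y]                      # one-hot targets, (n, C)
    d = X.shape[1]
    # W = (X^T X + gamma I)^{-1} X^T Y  -- the ridge solution, in one solve.
    W = np.linalg.solve(X.T @ X + gamma * np.eye(d), X.T @ Y)
    return W                                        # predictions: argmax(X @ W)
```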
1 code implementation • 21 Oct 2024 • Kejia Fan, Jiaxu Li, Songning Lai, Linpu Lv, Anfeng Liu, Jianheng Tang, Houbing Herbert Song, Yutao Yue, Huiping Zhuang
A primary challenge in this problem is catastrophic forgetting, where the incorporation of new data samples causes the models to forget previously learned information.
1 code implementation • 12 Oct 2024 • Kangdao Liu, Hao Zeng, Jianguo Huang, Huiping Zhuang, Chi-Man Vong, Hongxin Wei
Conformal prediction, as an emerging uncertainty quantification technique, typically functions as post-hoc processing for the outputs of trained classifiers.
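A minimal sketch of the standard split conformal recipe this builds on, using the common 1 − softmax nonconformity score; the score choice and threshold rule shown here are the textbook version, not necessarily this paper's variant.

```python
# Minimal sketch: split conformal prediction as post-hoc processing of a
# trained classifier's softmax outputs (standard recipe, not this paper's variant).
import numpy as np

def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    # Nonconformity score: 1 - probability assigned to the true class.
    scores = 1.0 - cal_probs[np.arange(len(cal_labels)), cal_labels]
    n = len(scores)
    # Finite-sample-corrected (1 - alpha) quantile of calibration scores.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return np.quantile(scores, q_level, method="higher")

def prediction_set(test_probs, q):
    # Include every class whose nonconformity score falls below the threshold.
    return [np.where(1.0 - p <= q)[0] for p in test_probs]
```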
no code implementations • 18 Sep 2024 • Yi Yang, Lei Zhong, Huiping Zhuang
We introduce a novel Recursive Fusion model, dubbed ReFu, designed to integrate point clouds and meshes for exemplar-free 3D Class-Incremental Learning, where the model learns new 3D classes while retaining knowledge of previously learned ones.
no code implementations • 12 Sep 2024 • Rongzihan Song, Zhenyu Weng, Huiping Zhuang, Jinchang Ren, Yongming Chen, Zhiping Lin
Specifically, we develop the feature adaptive continual-learning (FAC) module, a neural network trained online during tracking to adaptively learn features from all past tracking information.
2 code implementations • 19 Aug 2024 • Di Fang, Yinan Zhu, Runze Fang, Cen Chen, Ziqian Zeng, Huiping Zhuang
To solve this problem, we propose an analytic imbalance rectifier algorithm (AIR), a novel online exemplar-free continual learning method with an analytic (i.e., closed-form) solution for data-imbalanced class-incremental learning (CIL) and generalized CIL scenarios in real-world continual learning.
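One plausible way to rectify imbalance inside such a closed-form solution is to reweight samples inversely to class frequency, as in the sketch below; this shows the flavor of an analytic rectifier, not AIR's exact formulation.

```python
# Illustrative sketch: class-frequency reweighting inside a closed-form
# (ridge) classifier -- one plausible imbalance rectifier, not AIR's exact math.
import numpy as np

def fit_weighted_analytic_classifier(X, y, num_classes, gamma=1e-3):
    counts = np.bincount(y, minlength=num_classes).astype(float)
    w = (1.0 / counts)[y]                 # rarer classes get larger weights
    w *= len(y) / w.sum()                 # normalize so weights average to 1
    Y = np.eye(num_classes)[y]
    Xw = X * w[:, None]                   # weighted design-matrix rows
    d = X.shape[1]
    # Weighted ridge solution: (X^T W X + gamma I)^{-1} X^T W Y
    return np.linalg.solve(X.T @ Xw + gamma * np.eye(d), Xw.T @ Y)
```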
1 code implementation • 27 Jun 2024 • Yicheng Xu, Yuxin Chen, Jiahao Nie, Yusong Wang, Huiping Zhuang, Manabu Okumura
In this setting, a CL learner is required to incrementally learn from multiple domains and classify test images from both seen and unseen domains without any domain-identity hint.
1 code implementation • 20 Jun 2024 • Tao Zhang, Ziqian Zeng, Yuxiang Xiao, Huiping Zhuang, Cen Chen, James Foulds, Shimei Pan
Furthermore, we categorized the gender biases in the "rejected" responses of GenderAlign into four principal categories.
no code implementations • 3 Jun 2024 • Ziqian Zeng, Jianwei Wang, Junyao Yang, Zhengdong Lu, Huiping Zhuang, Cen Chen
The widespread usage of online Large Language Models (LLMs) inference services has raised significant privacy concerns about the potential exposure of private information in user inputs to malicious eavesdroppers.
1 code implementation • 28 May 2024 • Huiping Zhuang, Di Fang, Kai Tong, Yuchen Liu, Ziqian Zeng, Xu Zhou, Cen Chen
One of these scenarios can be formulated as an online continual learning (OCL) problem.
1 code implementation • 25 May 2024 • Huiping Zhuang, Run He, Kai Tong, Di Fang, Han Sun, Haoran Li, Tianyi Chen, Ziqian Zeng
In this paper, we introduce analytic federated learning (AFL), a new training paradigm that brings analytical (i.e., closed-form) solutions to the federated learning (FL) community.
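A minimal sketch of why closed-form solutions compose naturally with FL: each client ships only the sufficient statistics X^T X and X^T Y, and the server's single solve equals training on the pooled data. Function names are illustrative, not AFL's actual protocol.

```python
# Minimal sketch: closed-form federated fitting by aggregating sufficient
# statistics (X^T X, X^T Y) from each client; illustrative, not AFL's protocol.
import numpy as np

def client_statistics(X, y, num_classes):
    Y = np.eye(num_classes)[y]
    return X.T @ X, X.T @ Y               # no raw data leaves the client

def server_solve(stats, d, gamma=1e-3):
    A = sum(s[0] for s in stats) + gamma * np.eye(d)
    B = sum(s[1] for s in stats)
    return np.linalg.solve(A, B)          # identical to fitting on pooled data
```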
1 code implementation • 8 Apr 2024 • Weikai Lu, Ziqian Zeng, Jianwei Wang, Zhengdong Lu, Zelin Chen, Huiping Zhuang, Cen Chen
Jailbreaking attacks can enable Large Language Models (LLMs) to bypass their safeguards and generate harmful content.
1 code implementation • 26 Mar 2024 • Huiping Zhuang, Run He, Kai Tong, Ziqian Zeng, Cen Chen, Zhiping Lin
The compensation stream is governed by a Dual-Activation Compensation (DAC) module.
1 code implementation • 23 Mar 2024 • Huiping Zhuang, Yuchen Liu, Run He, Kai Tong, Ziqian Zeng, Cen Chen, Yi Wang, Lap-Pui Chau
In this paper, we propose an exemplar-free approach, Forward-only Online Analytic Learning (F-OAL).
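A minimal sketch of the recursive least-squares machinery that makes forward-only online updates possible: each batch refines the closed-form weights through the Woodbury identity, so no past data or backward pass is revisited. This illustrates the idea, not F-OAL's exact algorithm.

```python
# Minimal sketch: recursive least squares -- absorb each online batch into a
# closed-form classifier without revisiting past data (illustrative of the
# forward-only idea, not F-OAL's exact update).
import numpy as np

class RecursiveAnalyticClassifier:
    def __init__(self, dim, num_classes, gamma=1e-3):
        self.R = np.eye(dim) / gamma      # running (X^T X + gamma I)^{-1}
        self.W = np.zeros((dim, num_classes))

    def update(self, X, y):
        Y = np.eye(self.W.shape[1])[y]    # one-hot targets for this batch
        # Woodbury identity: rank-k update of the running inverse for batch X.
        K = self.R @ X.T @ np.linalg.inv(np.eye(len(X)) + X @ self.R @ X.T)
        self.R -= K @ X @ self.R
        self.W += self.R @ X.T @ (Y - X @ self.W)

    def predict(self, X):
        return (X @ self.W).argmax(axis=1)
```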
2 code implementations • 23 Mar 2024 • Huiping Zhuang, Yizhu Chen, Di Fang, Run He, Kai Tong, Hongxin Wei, Ziqian Zeng, Cen Chen
The GACL adopts analytic learning (a gradient-free training technique) and delivers an analytical (i.e., closed-form) solution to the GCIL scenario.
no code implementations • 20 Mar 2024 • Run He, Huiping Zhuang, Di Fang, Yizhu Chen, Kai Tong, Cen Chen
The DS-BPT pretrains the model on streams of both supervised learning and self-supervised contrastive learning (SSCL) for base knowledge extraction.
no code implementations • 24 Feb 2024 • Ziqian Zeng, Jiahong Yu, Qianshi Pang, ZiHao Wang, Huiping Zhuang, HongEn Shao, Xiaofeng Zou
Within this framework, we introduce a lightweight draft model that effectively utilizes previously generated tokens to predict subsequent words.
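A minimal sketch of the draft-then-verify loop behind this style of decoding, with greedy acceptance for clarity. The model interfaces (`greedy_next`, `greedy_next_batch`) are hypothetical placeholders, not this paper's API.

```python
# Minimal sketch: draft-then-verify decoding with a lightweight draft model
# (greedy acceptance shown for clarity; model interfaces are hypothetical).
def speculative_decode(target_model, draft_model, tokens, k=4, max_new=64):
    while len(tokens) < max_new:
        # 1) Draft model cheaply proposes k tokens from the current prefix.
        draft, ctx = [], list(tokens)
        for _ in range(k):
            t = draft_model.greedy_next(ctx)
            draft.append(t)
            ctx.append(t)
        # 2) Target model scores the whole proposed block in one pass,
        #    returning its own greedy token at each of the k+1 positions.
        verified = target_model.greedy_next_batch(tokens, draft)
        # 3) Keep the longest agreeing prefix, plus the target's own token
        #    at the first mismatch (or the bonus k+1-th token if all agree).
        n_ok = 0
        while n_ok < k and draft[n_ok] == verified[n_ok]:
            n_ok += 1
        tokens.extend(draft[:n_ok] + [verified[n_ok]])
    return tokens
```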
1 code implementation • 8 Feb 2024 • Zhenlong Liu, Lei Feng, Huiping Zhuang, Xiaofeng Cao, Hongxin Wei
In this work, we propose a novel method, Convex-Concave Loss, which induces a high variance in the training-loss distribution through gradient descent.
1 code implementation • 19 Dec 2023 • Ziqian Zeng, Yihuai Hong, Hongliang Dai, Huiping Zhuang, Cen Chen
We propose ConsistentEE, an early exiting method that is consistent in training and inference.
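For context, a minimal sketch of confidence-thresholded early exiting at inference: a lightweight head after every layer, and computation stops once one head is confident enough. The fixed-threshold rule here is the generic mechanism, not ConsistentEE's learned exit decisions.

```python
# Minimal sketch: confidence-thresholded early exiting at inference time
# (generic mechanism; ConsistentEE instead learns when to exit).
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    def __init__(self, layers, num_classes, dim, threshold=0.9):
        super().__init__()
        self.layers = nn.ModuleList(layers)
        # One lightweight classification head ("exit") after every layer.
        self.exits = nn.ModuleList(nn.Linear(dim, num_classes) for _ in layers)
        self.threshold = threshold

    def forward(self, h):
        # h: (1, seq, dim) -- single example, mean-pooled before each exit head.
        for layer, exit_head in zip(self.layers, self.exits):
            h = layer(h)
            probs = exit_head(h.mean(dim=1)).softmax(-1)
            if probs.max() >= self.threshold:   # confident enough: stop early
                return probs
        return probs                            # fell through: use final exit
```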
1 code implementation • CVPR 2023 • Huiping Zhuang, Zhenyu Weng, Run He, Zhiping Lin, Ziqian Zeng
In this paper, we approach FSCIL by adopting analytic learning, a technique that converts network training into linear problems.
no code implementations • 8 Dec 2022 • Hongxin Wei, Huiping Zhuang, Renchunzi Xie, Lei Feng, Gang Niu, Bo An, Yixuan Li
In the presence of noisy labels, designing robust loss functions is critical for securing the generalization performance of deep neural networks.
1 code implementation • 30 May 2022 • Huiping Zhuang, Zhenyu Weng, Hongxin Wei, Renchunzi Xie, Kar-Ann Toh, Zhiping Lin
Class-incremental learning (CIL) learns a classification model from training data in which new classes arrive progressively.
no code implementations • 14 Feb 2022 • Huiping Zhuang, Zhiping Lin, Yimin Yang, Kar-Ann Toh
Training convolutional neural networks (CNNs) with back-propagation (BP) is time-consuming and resource-intensive, particularly in view of the need to visit the dataset multiple times.
no code implementations • 3 Dec 2020 • Huiping Zhuang, Zhiping Lin, Kar-Ann Toh
Decoupled learning is a branch of model parallelism which parallelizes the training of a network by splitting it depth-wise into multiple modules.
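A minimal sketch of depth-wise decoupling, assuming local auxiliary losses: each module updates from its own detached input, so no global backward pass serializes the modules. This illustrates the parallelism being exploited, not the paper's specific delayed-gradient scheme.

```python
# Minimal sketch: depth-wise decoupled training -- each module updates from a
# local loss on a detached input, so no global backward pass chains modules
# together (illustrative; the paper studies delayed-gradient decoupling).
import torch
import torch.nn as nn

modules = [nn.Sequential(nn.Linear(32, 32), nn.ReLU()) for _ in range(3)]
aux_heads = [nn.Linear(32, 10) for _ in modules]   # local auxiliary classifiers
opts = [torch.optim.SGD(list(m.parameters()) + list(a.parameters()), lr=0.1)
        for m, a in zip(modules, aux_heads)]

def decoupled_step(x, y):
    h = x
    for m, aux, opt in zip(modules, aux_heads, opts):
        h = m(h.detach())                  # detach cuts the cross-module graph
        loss = nn.functional.cross_entropy(aux(h), y)
        opt.zero_grad(); loss.backward(); opt.step()
        # In true model parallelism each module would run on its own device.
    return loss.item()
```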
1 code implementation • 21 Jun 2019 • Huiping Zhuang, Yi Wang, Qinglai Liu, Shuai Zhang, Zhiping Lin
Training neural networks with back-propagation (BP) requires a sequential passing of activations and gradients, which forces the network modules to work in a synchronous fashion.