Search Results for author: Shibo Jie

Found 6 papers, 4 papers with code

Revisiting the Parameter Efficiency of Adapters from the Perspective of Precision Redundancy

1 code implementation • ICCV 2023 • Shibo Jie, Haoqing Wang, Zhi-Hong Deng

Current state-of-the-art results in computer vision depend in part on fine-tuning large pre-trained vision models.

Quantization
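
As a rough illustration of the quantization angle suggested by this entry's task tag (not the paper's actual scheme), here is a minimal sketch of symmetric uniform quantization applied to a small adapter-like weight matrix; the `n_bits` value and tensor shapes are arbitrary assumptions.

```python
import torch

def quantize_uniform(w: torch.Tensor, n_bits: int = 4):
    """Symmetric uniform quantization of a weight tensor to n_bits.

    Illustrative only: maps weights to integer levels plus one scale,
    showing how storage shrinks when small adapter weights are kept
    at low precision.
    """
    qmax = 2 ** (n_bits - 1) - 1                # e.g. 7 for signed 4-bit
    scale = w.abs().max().clamp(min=1e-8) / qmax
    q = torch.clamp(torch.round(w / scale), -qmax - 1, qmax)
    return q.to(torch.int8), scale              # store integers + one scale

def dequantize(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    return q.float() * scale

# Example: quantize a hypothetical adapter down-projection matrix.
w = torch.randn(768, 8)
q, s = quantize_uniform(w, n_bits=4)
w_hat = dequantize(q, s)
print("max abs reconstruction error:", (w - w_hat).abs().max().item())
```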

Detachedly Learn a Classifier for Class-Incremental Learning

no code implementations • 23 Feb 2023 • Ziheng Li, Shibo Jie, Zhi-Hong Deng

In continual learning, a model needs to continually learn a feature extractor and a classifier on a sequence of tasks.

Class Incremental Learning • Incremental Learning
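
To make the class-incremental setting described in this entry concrete, here is a minimal toy loop that trains a shared feature extractor and classifier on a sequence of tasks; the dimensions and placeholder data are assumptions, and no forgetting-mitigation method from the paper is included.

```python
import torch
import torch.nn as nn

# Toy class-incremental setup: tasks arrive sequentially, each with new classes.
feature_dim, classes_per_task, num_tasks = 64, 2, 3
extractor = nn.Sequential(nn.Linear(32, feature_dim), nn.ReLU())
classifier = nn.Linear(feature_dim, classes_per_task * num_tasks)
opt = torch.optim.SGD(
    list(extractor.parameters()) + list(classifier.parameters()), lr=0.1
)

for task_id in range(num_tasks):
    # Placeholder data: in practice each task has its own dataset.
    x = torch.randn(128, 32)
    y = torch.randint(0, classes_per_task, (128,)) + task_id * classes_per_task
    for _ in range(10):                      # a few gradient steps per task
        logits = classifier(extractor(x))
        loss = nn.functional.cross_entropy(logits, y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Without any continual-learning mechanism, earlier tasks are forgotten here.
```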

FacT: Factor-Tuning for Lightweight Adaptation on Vision Transformer

1 code implementation • 6 Dec 2022 • Shibo Jie, Zhi-Hong Deng

Recent work has explored the potential to adapt a pre-trained vision transformer (ViT) by updating only a few parameters so as to improve storage efficiency, an approach called parameter-efficient transfer learning (PETL).

Transfer Learning
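
As a generic illustration of the PETL idea mentioned in this entry (freeze the pre-trained backbone, update only a few parameters), the sketch below trains a small bottleneck adapter next to a frozen stand-in network; this is not FacT's tensor factorization, and all module names and sizes are assumptions.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Tiny bottleneck module added beside a frozen layer (generic PETL,
    not FacT's specific factorization)."""
    def __init__(self, dim: int, bottleneck: int = 8):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))   # residual update

# Stand-in for a pre-trained ViT block (a real pre-trained model would be loaded here).
backbone = nn.Sequential(nn.Linear(768, 768), nn.GELU(), nn.Linear(768, 768))
for p in backbone.parameters():
    p.requires_grad_(False)                  # freeze the pre-trained weights

adapter = Adapter(768)                       # only these weights are trained
trainable = sum(p.numel() for p in adapter.parameters())
total = sum(p.numel() for p in backbone.parameters()) + trainable
print(f"trainable params: {trainable} / {total} ({100 * trainable / total:.2f}%)")

opt = torch.optim.AdamW(adapter.parameters(), lr=1e-3)
x = torch.randn(4, 768)
out = adapter(backbone(x))                   # frozen features + learned adapter
```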

Bypassing Logits Bias in Online Class-Incremental Learning with a Generative Framework

no code implementations • 19 May 2022 • Gehui Shen, Shibo Jie, Ziheng Li, Zhi-Hong Deng

In our framework, a generative classifier that utilizes replay memory is used for inference, and the training objective is a pair-based metric learning loss that is theoretically proven to optimize the feature space in a generative way.

Class Incremental Learning • Incremental Learning • +1
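
As a loose sketch of the ingredients named in this entry, the code below pairs a simple pair-based metric loss with nearest-class-mean "generative" inference over a toy replay memory; the loss form, buffer contents, and function names are assumptions, not the paper's exact objective.

```python
import torch
import torch.nn.functional as F

# Hypothetical replay memory of stored feature/label pairs.
memory_feats = torch.randn(100, 64)
memory_labels = torch.randint(0, 5, (100,))

def pair_metric_loss(feats, labels, margin: float = 1.0):
    """Pull same-class features together, push different-class features apart
    (an illustrative pair-based metric loss, not the paper's)."""
    dists = torch.cdist(feats, feats)                    # pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    pos = dists[same].mean()
    neg = F.relu(margin - dists[~same]).mean()
    return pos + neg

def generative_classify(query, feats, labels, num_classes: int = 5):
    """Nearest-class-mean inference: score each class by distance to its
    feature mean, a Gaussian-like generative rule over the replay memory.
    Assumes every class appears at least once in the buffer."""
    means = torch.stack([feats[labels == c].mean(0) for c in range(num_classes)])
    return (-torch.cdist(query, means)).argmax(dim=1)    # closest mean wins

loss = pair_metric_loss(memory_feats, memory_labels)
pred = generative_classify(torch.randn(8, 64), memory_feats, memory_labels)
```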

Alleviating Representational Shift for Continual Fine-tuning

1 code implementation • 22 Apr 2022 • Shibo Jie, Zhi-Hong Deng, Ziheng Li

We study a practical setting of continual learning: continually fine-tuning a pre-trained model.

Continual Learning
