Search Results for author: Xialei Liu

Found 34 papers, 29 papers with code

Strip R-CNN: Large Strip Convolution for Remote Sensing Object Detection

3 code implementations • 7 Jan 2025 • Xinbin Yuan, Zhaohui Zheng, YuXuan Li, Xialei Liu, Li Liu, Xiang Li, Qibin Hou, Ming-Ming Cheng

Despite rapid recent development, remote sensing object detection remains challenging, especially for objects with high aspect ratios.

 Ranked #1 on Object Detection In Aerial Images on DOTA (using extra training data)

Object • Object Detection +2
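
As an illustration of the strip-convolution idea behind this paper, the sketch below pairs a 1×k and a k×1 depthwise convolution so the receptive field elongates along each axis; the module name, kernel size, and composition order are assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class StripConv(nn.Module):
    """Illustrative large strip convolution: a 1xk and a kx1 depthwise
    convolution whose elongated receptive fields suit high-aspect-ratio
    objects (kernel size k=19 is an assumption)."""
    def __init__(self, channels: int, k: int = 19):
        super().__init__()
        self.horizontal = nn.Conv2d(channels, channels, (1, k),
                                    padding=(0, k // 2), groups=channels)
        self.vertical = nn.Conv2d(channels, channels, (k, 1),
                                  padding=(k // 2, 0), groups=channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.vertical(self.horizontal(x))

feat = torch.randn(1, 64, 128, 128)
out = StripConv(64)(feat)   # shape preserved: (1, 64, 128, 128)
```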

Early Preparation Pays Off: New Classifier Pre-tuning for Class Incremental Semantic Segmentation

1 code implementation • 19 Jul 2024 • Zhengyuan Xie, Haiquan Lu, Jia-Wen Xiao, Enguang Wang, Le Zhang, Xialei Liu

In this paper, we propose a new classifier pre-tuning (NeST) method applied before the formal training process: instead of directly tuning the parameters of new classifiers, it learns a transformation from old classifiers to generate new classifiers for initialization.

Class-Incremental Semantic Segmentation
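
A minimal sketch of the pre-tuning idea as the abstract describes it: a transformation is learned over old classifier weights, and its output initializes the new classifiers. The transformation's form, the pooling step, and all names below are assumptions.

```python
import torch
import torch.nn as nn

# Old classifiers: one 512-d weight vector per old class (10 here, assumed).
old_weights = torch.randn(10, 512)

# Learned transformation over old classifier weights (assumed MLP form);
# per the abstract, this is optimized before formal training ("pre-tuning").
transform = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))

# ... optimize `transform` on the pre-tuning objective here (not shown) ...

with torch.no_grad():
    # Generate an initialization for one new class from the transformed old
    # classifiers (mean-pooling across classes is an assumption).
    new_init = transform(old_weights).mean(dim=0, keepdim=True)   # (1, 512)
    new_classifier = nn.Linear(512, 1, bias=False)
    new_classifier.weight.copy_(new_init)
```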

Class-Incremental Learning with CLIP: Adaptive Representation Adjustment and Parameter Fusion

1 code implementation • 19 Jul 2024 • Linlan Huang, Xusheng Cao, Haori Lu, Xialei Liu

Most existing works with pre-trained models assume that the forgetting of old classes is uniform when the model acquires new knowledge.

Class-Incremental Learning +1

Generative Multi-modal Models are Good Class-Incremental Learners

1 code implementation • 27 Mar 2024 • Xusheng Cao, Haori Lu, Linlan Huang, Xialei Liu, Ming-Ming Cheng

In class-incremental learning (CIL) scenarios, the phenomenon of catastrophic forgetting caused by the classifier's bias towards the current task has long posed a significant challenge.

Class-Incremental Learning +1

Unlocking the Multi-modal Potential of CLIP for Generalized Category Discovery

1 code implementation • 15 Mar 2024 • Enguang Wang, Zhimao Peng, Zhengyuan Xie, Fei Yang, Xialei Liu, Ming-Ming Cheng

Specifically, our TES leverages the property that CLIP generates aligned vision-language features, converting visual embeddings into tokens of CLIP's text encoder to produce pseudo text embeddings.
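
The TES step can be pictured roughly as below: a visual embedding is projected into a sequence of pseudo tokens and passed through the text encoder to obtain a pseudo text embedding. The modules here are stand-ins for frozen CLIP components, and all dimensions are assumptions.

```python
import torch
import torch.nn as nn

# Stand-ins for frozen CLIP components; dimensions are assumptions.
visual_dim, token_dim, n_tokens = 512, 512, 8
image_embedding = torch.randn(1, visual_dim)      # from CLIP's image encoder

# Convert the visual embedding into a sequence of pseudo tokens for the
# text encoder, exploiting CLIP's aligned vision-language space.
to_tokens = nn.Linear(visual_dim, n_tokens * token_dim)
pseudo_tokens = to_tokens(image_embedding).view(1, n_tokens, token_dim)

# Stand-in text tower; in practice this would be CLIP's frozen text encoder.
text_encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=token_dim, nhead=8, batch_first=True),
    num_layers=2,
)
pseudo_text_embedding = text_encoder(pseudo_tokens).mean(dim=1)   # (1, 512)
```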

Class Incremental Learning with Pre-trained Vision-Language Models

no code implementations • 31 Oct 2023 • Xialei Liu, Xusheng Cao, Haori Lu, Jia-Wen Xiao, Andrew D. Bagdanov, Ming-Ming Cheng

We also propose a method for parameter retention in the adapter layers that uses a measure of parameter importance to better maintain stability and plasticity during incremental learning.

Class-Incremental Learning +2
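
A rough sketch of importance-based parameter retention: parameters that mattered for old tasks are pulled back toward their previous values, while unimportant ones keep their newly learned values. The blending rule and the importance estimate below are assumptions, not the paper's exact method.

```python
import torch

# Old adapter parameter values and their values after training on a new task.
old_param = torch.randn(256)
new_param = old_param + 0.1 * torch.randn(256)

# Per-parameter importance in [0, 1] (e.g. from accumulated gradient
# statistics; the exact measure here is an assumption).
importance = torch.rand(256)

# Important parameters stay close to old values (stability); unimportant
# ones keep their new values (plasticity).
retained = importance * old_param + (1.0 - importance) * new_param
```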

Masked Autoencoders are Efficient Class Incremental Learners

1 code implementation • ICCV 2023 • Jiang-Tian Zhai, Xialei Liu, Andrew D. Bagdanov, Ke Li, Ming-Ming Cheng

Moreover, MAEs can reliably reconstruct original input images from randomly selected patches, which we use to store exemplars from past tasks more efficiently for CIL.

Class-Incremental Learning +1
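
A sketch of the exemplar-storage idea: only a random subset of patches is kept per exemplar, and a masked autoencoder later reconstructs the full image for replay. The patch geometry and the 25% keep ratio are assumptions.

```python
import torch

# Split an image into 16x16 patches (a 14x14 grid for a 224x224 input).
image = torch.randn(3, 224, 224)
patches = image.unfold(1, 16, 16).unfold(2, 16, 16)
patches = patches.reshape(3, -1, 16, 16).permute(1, 0, 2, 3)  # (196, 3, 16, 16)

# Store only a random 25% of the patches per exemplar (ratio is assumed).
keep = torch.randperm(patches.size(0))[:49]
stored = {"indices": keep, "patches": patches[keep]}

# At replay time, a pre-trained MAE decoder (not shown) would reconstruct
# the remaining patches from `stored` to recover a full training image.
```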

Make Explicit Calibration Implicit: Calibrate Denoiser Instead of the Noise Model

1 code implementation • ICCV 2023 • Xin Jin, Jia-Wen Xiao, Ling-Hao Han, Chunle Guo, Xialei Liu, Chongyi Li, Ming-Ming Cheng

However, these methods are impeded by several critical limitations: a) the explicit calibration process is both labor- and time-intensive, b) transferring denoisers across different camera models is challenging, and c) the disparity between synthetic and real noise is exacerbated by digital gain.

Image Denoising

Endpoints Weight Fusion for Class Incremental Semantic Segmentation

no code implementations • CVPR 2023 • Jia-Wen Xiao, Chang-Bin Zhang, Jiekang Feng, Xialei Liu, Joost Van de Weijer, Ming-Ming Cheng

In our method, the model containing old knowledge is fused with the model retaining new knowledge in a dynamic fusion manner, strengthening the memory of old classes in ever-changing distributions.

Class-Incremental Learning • Class-Incremental Semantic Segmentation +2
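
The fusion step can be sketched as a parameter-wise interpolation between the old and new models. A fixed coefficient is used below, whereas the paper fuses dynamically, so treat this as a simplified stand-in.

```python
import copy
import torch
import torch.nn as nn

# Endpoint models: the old-task model and a copy trained on the new task.
old_model = nn.Sequential(nn.Linear(128, 128), nn.ReLU(), nn.Linear(128, 10))
new_model = copy.deepcopy(old_model)   # ... then trained on new classes ...

alpha = 0.5   # fixed here; the paper computes the fusion weight dynamically
with torch.no_grad():
    for p_old, p_new in zip(old_model.parameters(), new_model.parameters()):
        # Parameter-wise fusion: p_new <- alpha * p_old + (1 - alpha) * p_new
        p_new.mul_(1 - alpha).add_(alpha * p_old)
```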

Task-Adaptive Saliency Guidance for Exemplar-free Class Incremental Learning

1 code implementation • CVPR 2024 • Xialei Liu, Jiang-Tian Zhai, Andrew D. Bagdanov, Ke Li, Ming-Ming Cheng

EFCIL is of interest because it mitigates concerns about privacy and long-term storage of data, while at the same time alleviating the problem of catastrophic forgetting in incremental learning.

Class-Incremental Learning +1

Long-Tailed Class Incremental Learning

1 code implementation • 1 Oct 2022 • Xialei Liu, Yu-Song Hu, Xu-Sheng Cao, Andrew D. Bagdanov, Ke Li, Ming-Ming Cheng

However, conventional CIL methods consider a balanced distribution for each new task, which ignores the prevalence of long-tailed distributions in the real world.

Class-Incremental Learning +1

Universal Representations: A Unified Look at Multiple Task and Domain Learning

2 code implementations • 6 Apr 2022 • Wei-Hong Li, Xialei Liu, Hakan Bilen

We propose a unified look at jointly learning multiple vision tasks and visual domains through universal representations, a single deep neural network.

Cross-Domain Few-Shot Learning • Image Classification
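
The single-network idea can be sketched as one shared backbone feeding small task- or domain-specific heads; the layer sizes and the particular heads below are illustrative assumptions.

```python
import torch
import torch.nn as nn

# One shared backbone produces a universal representation.
backbone = nn.Sequential(nn.Linear(3 * 32 * 32, 256), nn.ReLU(),
                         nn.Linear(256, 256), nn.ReLU())

# Lightweight heads specialize the shared features per task/domain
# (the set of heads here is an assumption).
heads = nn.ModuleDict({
    "imagenet": nn.Linear(256, 1000),
    "segmentation": nn.Linear(256, 21),
    "depth": nn.Linear(256, 1),
})

x = torch.randn(4, 3 * 32 * 32)
features = backbone(x)                      # shared universal features
outputs = {name: head(features) for name, head in heads.items()}
```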

Representation Compensation Networks for Continual Semantic Segmentation

1 code implementation • CVPR 2022 • Chang-Bin Zhang, Jia-Wen Xiao, Xialei Liu, Ying-Cong Chen, Ming-Ming Cheng

In this work, we study the continual semantic segmentation problem, where deep neural networks are required to incorporate new classes continually without catastrophic forgetting.

Class-Incremental Learning • Continual Semantic Segmentation +16

Learning Multiple Dense Prediction Tasks from Partially Annotated Data

1 code implementation • CVPR 2022 • Wei-Hong Li, Xialei Liu, Hakan Bilen

Despite the recent advances in multi-task learning of dense prediction problems, most methods rely on expensive labelled datasets.

Multi-Task Learning

Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition

1 code implementation • 9 Nov 2021 • Kai Wang, Xialei Liu, Andy Bagdanov, Luis Herranz, Shangling Jui, Joost Van de Weijer

We propose an approach to IML, which we call Episodic Replay Distillation (ERD), that mixes classes from the current task with class exemplars from previous tasks when sampling episodes for meta-learning.

Continual Learning • Knowledge Distillation +1
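
Episode sampling in the spirit of ERD can be sketched as drawing some episode classes from the current task and some from the exemplar memory of previous tasks; the class split and data structures below are assumptions.

```python
import random

# Classes available in the current task and in the exemplar memory.
current_task_classes = ["dog", "cat", "bird"]
exemplar_classes = ["car", "plane", "chair", "table"]   # from earlier tasks

def sample_episode(n_way: int = 4):
    """Mix old-task exemplar classes with current-task classes in one episode
    (a 50/50 split is an assumption)."""
    n_old = n_way // 2
    classes = (random.sample(exemplar_classes, n_old)
               + random.sample(current_task_classes, n_way - n_old))
    random.shuffle(classes)
    return classes  # then sample support/query images for each class

print(sample_episode())  # e.g. ['car', 'dog', 'chair', 'bird']
```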

HCV: Hierarchy-Consistency Verification for Incremental Implicitly-Refined Classification

1 code implementation • 21 Oct 2021 • Kai Wang, Xialei Liu, Luis Herranz, Joost Van de Weijer

To overcome forgetting in this benchmark, we propose Hierarchy-Consistency Verification (HCV) as an enhancement to existing continual learning methods.

Classification • Continual Learning +1

Cross-domain Few-shot Learning with Task-specific Adapters

4 code implementations • CVPR 2022 • Wei-Hong Li, Xialei Liu, Hakan Bilen

In this paper, we look at the problem of cross-domain few-shot classification, which aims to learn a classifier for previously unseen classes and domains from few labeled samples.

Cross-Domain Few-Shot Learning • Few-Shot Image Classification

Universal Representation Learning from Multiple Domains for Few-shot Classification

5 code implementations • ICCV 2021 • Wei-Hong Li, Xialei Liu, Hakan Bilen

In this paper, we look at the problem of few-shot classification that aims to learn a classifier for previously unseen classes and domains from few labeled samples.

Classification • Few-Shot Image Classification +2

Class-incremental learning: survey and performance evaluation on image classification

1 code implementation • 28 Oct 2020 • Marc Masana, Xialei Liu, Bartlomiej Twardowski, Mikel Menta, Andrew D. Bagdanov, Joost Van de Weijer

For future learning systems, incremental learning is desirable because it allows for: efficient resource usage by eliminating the need to retrain from scratch at the arrival of new data; reduced memory usage by preventing or limiting the amount of data required to be stored -- also important when privacy limitations are imposed; and learning that more closely resembles human learning.

Class-Incremental Learning +4

Learning to Rank for Active Learning: A Listwise Approach

no code implementations • 31 Jul 2020 • Minghan Li, Xialei Liu, Joost Van de Weijer, Bogdan Raducanu

Active learning emerged as an alternative to alleviate the effort of labeling huge amounts of data for data-hungry applications (such as image/video indexing and retrieval, autonomous driving, etc.).

Active Learning • Autonomous Driving +3

Semantic Drift Compensation for Class-Incremental Learning

2 code implementations • CVPR 2020 • Lu Yu, Bartłomiej Twardowski, Xialei Liu, Luis Herranz, Kai Wang, Yongmei Cheng, Shangling Jui, Joost Van de Weijer

The vast majority of methods have studied this scenario for classification networks, where for each new task the classification layer of the network must be augmented with additional weights to make room for the newly added classes.

Class-Incremental Learning +2
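
The classifier-growing step the abstract mentions can be sketched as below: the classification layer gains new weight rows while the old rows are preserved. (The paper's semantic drift compensation of class prototypes is not shown here.)

```python
import torch
import torch.nn as nn

def expand_classifier(fc: nn.Linear, n_new: int) -> nn.Linear:
    """Augment a classification layer with weights for n_new added classes,
    keeping the old class weights intact."""
    grown = nn.Linear(fc.in_features, fc.out_features + n_new)
    with torch.no_grad():
        grown.weight[: fc.out_features] = fc.weight   # keep old class rows
        grown.bias[: fc.out_features] = fc.bias
    return grown

fc = nn.Linear(512, 10)          # classifier over 10 old classes
fc = expand_classifier(fc, 5)    # now covers 15 classes
```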

Multi-Task Incremental Learning for Object Detection

no code implementations • 13 Feb 2020 • Xialei Liu, Hao Yang, Avinash Ravichandran, Rahul Bhotika, Stefano Soatto

For the difficult cases, where the domain gaps and especially the category differences are large, we explore three different exemplar sampling methods and show that the proposed adaptive sampling method is effective in selecting diverse and informative samples from entire datasets, further preventing forgetting.

Incremental Learning • Object +2

Exploiting Unlabeled Data in CNNs by Self-supervised Learning to Rank

2 code implementations • 17 Feb 2019 • Xialei Liu, Joost Van de Weijer, Andrew D. Bagdanov

Our results show that networks trained to regress to the ground truth targets for labeled data and to simultaneously learn to rank unlabeled data obtain significantly better, state-of-the-art results for both IQA and crowd counting.

Active Learning • Crowd Counting +5
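
The joint objective the abstract describes can be sketched as a regression loss on labeled data plus a margin ranking loss on unlabeled pairs whose relative order is known for free; the network and the loss weighting below are assumptions.

```python
import torch
import torch.nn as nn

# Toy scoring network (stand-in for an IQA or crowd-counting regressor).
net = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))

labeled_x, labeled_y = torch.randn(8, 128), torch.randn(8, 1)
higher, lower = torch.randn(8, 128), torch.randn(8, 128)  # known ordering

# Regress to ground truth on labeled data ...
regression_loss = nn.functional.mse_loss(net(labeled_x), labeled_y)
# ... while simultaneously learning to rank unlabeled pairs.
ranking_loss = nn.functional.margin_ranking_loss(
    net(higher), net(lower), target=torch.ones(8, 1), margin=1.0
)
loss = regression_loss + 0.1 * ranking_loss   # weighting is an assumption
loss.backward()
```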

Memory Replay GANs: Learning to Generate New Categories without Forgetting

1 code implementation • NeurIPS 2018 • Chenshen Wu, Luis Herranz, Xialei Liu, Yaxing Wang, Joost Van de Weijer, Bogdan Raducanu

In particular, we investigate generative adversarial networks (GANs) in the task of learning new categories in a sequential fashion.

Leveraging Unlabeled Data for Crowd Counting by Learning to Rank

1 code implementation • CVPR 2018 • Xialei Liu, Joost Van de Weijer, Andrew D. Bagdanov

We propose a novel crowd counting approach that leverages abundantly available unlabeled crowd imagery in a learning-to-rank framework.

Crowd Counting • Image Retrieval +2
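
The source of free ranking supervision can be sketched as below: a crop contained in a larger crop holds at most as many people, so nested crops of an unlabeled image form ordered pairs; the crop schedule here is illustrative.

```python
import torch

def contained_crops(image: torch.Tensor, n: int = 3):
    """Return progressively smaller center crops of one image; by containment,
    count(crops[i]) >= count(crops[i + 1])."""
    _, h, w = image.shape
    crops = []
    for i in range(n):
        mh, mw = int(h * 0.15) * i, int(w * 0.15) * i   # growing margins
        crops.append(image[:, mh : h - mh, mw : w - mw])
    return crops

crowd_image = torch.randn(3, 480, 640)   # unlabeled crowd photo
ordered = contained_crops(crowd_image)
# A counting network's predictions on these crops can then be trained with a
# margin ranking loss to respect the containment order.
```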

Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting

2 code implementations • 8 Feb 2018 • Xialei Liu, Marc Masana, Luis Herranz, Joost Van de Weijer, Antonio M. Lopez, Andrew D. Bagdanov

In this paper we propose an approach to avoiding catastrophic forgetting in sequential task learning scenarios.

RankIQA: Learning from Rankings for No-reference Image Quality Assessment

2 code implementations • ICCV 2017 • Xialei Liu, Joost Van de Weijer, Andrew D. Bagdanov

Furthermore, on the LIVE benchmark we show that our approach is superior to existing NR-IQA techniques and that we even outperform state-of-the-art full-reference IQA (FR-IQA) methods without having to resort to high-quality reference images to infer IQA.
