Search Results for author: Xinting Hu

Found 4 papers, 3 papers with code

Boundary and Relation Distillation for Semantic Segmentation

no code implementations • 24 Jan 2024 • Dong Zhang, Pingcheng Dong, Xinting Hu, Long Chen, Kwang-Ting Cheng

Concurrently, the relation distillation transfers implicit relations from the teacher model to the student model using pixel-level self-relation as a bridge, ensuring that the student's mask has strong target region connectivity.

Implicit Relations • Knowledge Distillation • +2
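
The pixel-level self-relation idea can be pictured as matching per-image pixel affinity matrices between the teacher and the student. The sketch below is a minimal illustration only, assuming cosine-similarity affinities and an MSE matching loss; the function names, feature shapes, and loss choice are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch of relation distillation via pixel-level self-relation
# (illustrative assumptions, not the paper's exact formulation).
import torch
import torch.nn.functional as F


def self_relation(feat: torch.Tensor) -> torch.Tensor:
    """Pixel-level self-relation: (B, C, H, W) features -> (B, H*W, H*W) cosine affinities."""
    flat = feat.flatten(2).transpose(1, 2)        # (B, H*W, C) pixel embeddings
    flat = F.normalize(flat, dim=-1)              # unit-normalize each pixel embedding
    return torch.bmm(flat, flat.transpose(1, 2))  # pairwise cosine similarities


def relation_distillation_loss(student_feat: torch.Tensor,
                               teacher_feat: torch.Tensor) -> torch.Tensor:
    """Match the student's self-relation matrix to the (frozen) teacher's with MSE."""
    with torch.no_grad():
        r_teacher = self_relation(teacher_feat)
    r_student = self_relation(student_feat)
    return F.mse_loss(r_student, r_teacher)
```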

On Non-Random Missing Labels in Semi-Supervised Learning

1 code implementation • ICLR 2022 • Xinting Hu, Yulei Niu, Chunyan Miao, Xian-Sheng Hua, Hanwang Zhang

Our method is three-fold: 1) We propose Class-Aware Propensity (CAP) that exploits the unlabeled data to train an improved classifier using the biased labeled data.

Imputation • Missing Labels • +1
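
One rough way to picture Class-Aware Propensity is as per-class labeling probabilities, estimated with help from the unlabeled data and then used to reweight the supervised loss on the biased labeled data. The sketch below is an assumption-laden illustration only: the propensity estimate from predicted class frequencies on unlabeled data and the inverse-propensity cross-entropy weighting are stand-ins, not the paper's actual formulation.

```python
# Hedged sketch of inverse-propensity reweighting in the spirit of CAP
# (the estimation and weighting scheme here are illustrative assumptions).
import torch
import torch.nn.functional as F


def class_propensity(labels: torch.Tensor,
                     unlabeled_probs: torch.Tensor,
                     num_classes: int,
                     eps: float = 1e-8) -> torch.Tensor:
    """Estimate P(labeled | class) from labeled counts plus predictions on unlabeled data."""
    labeled_counts = torch.bincount(labels, minlength=num_classes).float()
    unlabeled_counts = unlabeled_probs.sum(dim=0)   # expected class counts on unlabeled data
    total_counts = labeled_counts + unlabeled_counts
    return labeled_counts / (total_counts + eps)


def cap_weighted_ce(logits: torch.Tensor,
                    labels: torch.Tensor,
                    propensity: torch.Tensor) -> torch.Tensor:
    """Cross-entropy on labeled data, reweighted by inverse class propensity."""
    weights = 1.0 / propensity.clamp_min(1e-8)
    weights = weights / weights.mean()              # keep the overall loss scale stable
    per_sample = F.cross_entropy(logits, labels, reduction="none")
    return (weights[labels] * per_sample).mean()
```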

Distilling Causal Effect of Data in Class-Incremental Learning

1 code implementation • CVPR 2021 • Xinting Hu, Kaihua Tang, Chunyan Miao, Xian-Sheng Hua, Hanwang Zhang

We propose a causal framework to explain the catastrophic forgetting in Class-Incremental Learning (CIL) and then derive a novel distillation method that is orthogonal to the existing anti-forgetting techniques, such as data replay and feature/label distillation.

Class Incremental Learning • Incremental Learning

Learning to Segment the Tail

1 code implementation • CVPR 2020 • Xinting Hu, Yi Jiang, Kaihua Tang, Jingyuan Chen, Chunyan Miao, Hanwang Zhang

Real-world visual recognition requires handling the extreme sample imbalance in large-scale long-tailed data.

Few-Shot Learning • Incremental Learning
