Search Results for author: Shengqin Jiang

Found 4 papers, 2 papers with code

Teacher Agent: A Knowledge Distillation-Free Framework for Rehearsal-based Video Incremental Learning

1 code implementation • 1 Jun 2023 • Shengqin Jiang, Yaoyu Fang, Haokui Zhang, Qingshan Liu, Yuankai Qi, Yang Yang, Peng Wang

Rehearsal-based video incremental learning often employs knowledge distillation to mitigate catastrophic forgetting of previously learned data.
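The distillation objective that such rehearsal-based methods typically rely on, and that this knowledge-distillation-free framework sets out to avoid, is the standard soft-label loss of Hinton et al. A minimal sketch (the temperature value is illustrative, not from the paper):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Classic soft-label knowledge distillation: KL divergence between
    # temperature-softened teacher and student distributions, scaled by
    # T^2 to keep gradient magnitudes comparable across temperatures.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T
```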

Incremental Learning • Knowledge Distillation +1

Crowd Counting with Online Knowledge Learning

no code implementations • 18 Mar 2023 • Shengqin Jiang, Bowen Li, Fengna Cheng, Qingshan Liu

Moreover, we propose a feature relation distillation method that allows the student branch to comprehend the evolution of inter-layer features more effectively by constructing a new inter-layer relationship matrix.
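The abstract does not specify how the inter-layer relationship matrix is built. One plausible sketch: embed each layer's feature map into a fixed-size vector, take pairwise cosine similarities across layers, and match the student's matrix to the teacher's. The pooling scheme, similarity measure, and MSE objective below are all assumptions, not the paper's actual construction:

```python
import torch
import torch.nn.functional as F

def layer_embedding(feat, size=8):
    # Collapse channels, then pool the spatial map to a fixed grid so that
    # layers with different channel counts and resolutions become comparable.
    m = feat.mean(dim=1, keepdim=True)        # (B, 1, H, W)
    m = F.adaptive_avg_pool2d(m, size)        # (B, 1, size, size)
    return m.flatten(1)                       # (B, size * size)

def relation_matrix(feats, size=8):
    # Stack per-layer embeddings: (L, B, D), then L2-normalize over D.
    emb = torch.stack([layer_embedding(f, size) for f in feats])
    emb = F.normalize(emb, dim=-1)
    # (B, L, L): cosine similarity between every pair of layers, per sample.
    return torch.einsum("lbd,mbd->blm", emb, emb)

def relation_distill_loss(teacher_feats, student_feats):
    # Match the student's inter-layer relationship matrix to the teacher's.
    rt = relation_matrix([f.detach() for f in teacher_feats])
    rs = relation_matrix(student_feats)
    return F.mse_loss(rs, rt)
```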

Crowd Counting • Edge-computing +1

A Unified Object Counting Network with Object Occupation Prior

1 code implementation • 29 Dec 2022 • Shengqin Jiang, Qing Wang, Fengna Cheng, Yuankai Qi, Qingshan Liu

In this paper, we build the first evolving object counting dataset and propose a unified object counting network as the first attempt to address this task.

Crowd Counting • Knowledge Distillation +2

Mask-aware networks for crowd counting

no code implementations • 18 Dec 2018 • Shengqin Jiang, Xiaobo Lu, Yinjie Lei, Lingqiao Liu

Our rationale is that the mask prediction could be better modeled as a binary segmentation problem and the difficulty of estimating the density could be reduced if the mask is known.
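The mask-aware idea described above, a binary crowd/background mask modeled as a segmentation problem and used to ease density estimation, can be sketched as a two-branch head where the mask gates the density output. The layer shapes and the gating-by-multiplication choice are assumptions, not the paper's actual architecture:

```python
import torch
import torch.nn as nn

class MaskAwareHead(nn.Module):
    # Hypothetical two-branch head: a binary crowd/background mask branch
    # (a per-pixel segmentation) and a density branch; the mask gates the
    # density map so the regressor only estimates counts in crowd regions.
    def __init__(self, in_ch=64):
        super().__init__()
        self.mask_branch = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 1), nn.Sigmoid())   # per-pixel crowd probability
        self.density_branch = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 1), nn.ReLU())      # non-negative density

    def forward(self, x):
        mask = self.mask_branch(x)
        density = self.density_branch(x) * mask  # mask suppresses background
        return density, mask
```

At inference, summing the gated density map over all pixels yields the predicted count.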

Crowd Counting • Object
