Search Results for author: Chenshu Chen

Found 1 paper, 0 papers with code

Normalized Feature Distillation for Semantic Segmentation

no code implementations · 12 Jul 2022 · Tao Liu, Xi Yang, Chenshu Chen

As a promising approach to model compression, knowledge distillation improves the performance of a compact model by transferring knowledge from a cumbersome one.
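The classic logit-based form of knowledge distillation can be sketched as follows. This is a generic illustration of temperature-softened distillation (in the style of Hinton et al.), not the normalized feature distillation method proposed in the paper; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of raw logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Generic distillation loss: KL(teacher || student) on softened
    distributions, scaled by T^2 so gradients stay comparable across
    temperatures. Illustrative sketch, not the paper's method."""
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q)
    )
```

When the student matches the teacher exactly, the loss is zero; any mismatch in the softened distributions yields a positive penalty that the compact model can minimize during training.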

Tasks: Knowledge Distillation · Model Compression · +2
