Search Results for author: Dian Qin

Found 2 papers, 2 papers with code

Hilbert Distillation for Cross-Dimensionality Networks

1 code implementation • 8 Nov 2022 • Dian Qin, Haishuai Wang, Zhe Liu, Hongjia Xu, Sheng Zhou, Jiajun Bu

Because the distilled 2D networks are supervised by Hilbert curves converted from dimensionally heterogeneous 3D features, they gain an informative view of the structural information embedded in well-trained high-dimensional representations.
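The snippet below is a minimal sketch of the general idea rather than the released implementation: 3D teacher features and 2D student features are each flattened along a Hilbert-curve ordering and matched with a pointwise loss. It assumes PyTorch and the third-party `hilbertcurve` package; the function names, the length interpolation, and the MSE matching term are illustrative assumptions, not the paper's exact loss.

```python
# Sketch of Hilbert-curve-based cross-dimensional distillation (not the authors' code).
# Assumes PyTorch and `pip install hilbertcurve` (hilbertcurve >= 2.0 API).
import torch
import torch.nn.functional as F
from hilbertcurve.hilbertcurve import HilbertCurve

def hilbert_order(shape, bits):
    """Flat indices that visit a 2D or 3D grid in Hilbert-curve order."""
    assert all(s <= 2 ** bits for s in shape), "increase `bits` to cover the grid"
    n = len(shape)
    hc = HilbertCurve(p=bits, n=n)
    order = []
    for d in range((2 ** bits) ** n):
        coords = hc.point_from_distance(d)
        if all(c < s for c, s in zip(coords, shape)):  # skip cells outside the grid
            flat = 0
            for c, s in zip(coords, shape):
                flat = flat * s + c                    # row-major flat index
            order.append(flat)
    return torch.tensor(order, dtype=torch.long)

def hilbert_distill_loss(teacher_3d, student_2d, bits=3):
    """teacher_3d: (B, C, D, H, W) 3D features; student_2d: (B, C, H, W) 2D features."""
    t_idx = hilbert_order(teacher_3d.shape[2:], bits)  # 3D Hilbert ordering
    s_idx = hilbert_order(student_2d.shape[2:], bits)  # 2D Hilbert ordering
    t_seq = teacher_3d.flatten(2)[:, :, t_idx]         # (B, C, Lt) 1D teacher curve
    s_seq = student_2d.flatten(2)[:, :, s_idx]         # (B, C, Ls) 1D student curve
    # Resample the teacher curve to the student's length before matching.
    t_seq = F.interpolate(t_seq, size=s_seq.shape[-1], mode="linear", align_corners=False)
    return F.mse_loss(s_seq, t_seq.detach())
```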

Efficient Medical Image Segmentation Based on Knowledge Distillation

1 code implementation • 23 Aug 2021 • Dian Qin, Jiajun Bu, Zhe Liu, Xin Shen, Sheng Zhou, Jingjun Gu, Zhihua Wang, Lei Wu, Huifen Dai

To address this problem, we propose an efficient architecture that distills knowledge from well-trained medical image segmentation networks into a lightweight network.

Image Segmentation • Knowledge Distillation • +3
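As a rough illustration of the teacher-student setup (not the paper's released code or its exact objective), the sketch below trains a lightweight student segmentation network with a pixel-wise Hinton-style soft-label term from a frozen teacher plus the usual cross-entropy term; the temperature and loss weighting are illustrative assumptions.

```python
# Generic response-based distillation for semantic segmentation (illustrative sketch).
import torch
import torch.nn.functional as F

def segmentation_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """student_logits, teacher_logits: (B, num_classes, H, W); labels: (B, H, W) int64."""
    # Softened per-pixel class distributions from student and frozen teacher.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits.detach() / T, dim=1)
    # KL divergence summed over classes, averaged over batch and pixels,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    kd = (F.kl_div(log_p_student, p_teacher, reduction="none")
          .sum(dim=1).mean()) * (T * T)
    # Standard supervised loss against the ground-truth masks.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```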
