no code implementations • 7 May 2022 • Zhengbo Zhang, Chunluan Zhou, Zhigang Tu
Knowledge distillation is widely adopted in semantic segmentation to reduce computational cost. Previous knowledge distillation methods for semantic segmentation focus on pixel-wise feature alignment and intra-class feature variation distillation, but neglect to transfer knowledge of inter-class distances in the feature space, which is important for semantic segmentation.
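The inter-class distance idea described above can be illustrated with a minimal sketch: compute per-class feature centers for both teacher and student, build the pairwise distance matrix between class centers, and penalize the mismatch between the two matrices. All function and variable names below are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def inter_class_distance_loss(feat_s, feat_t, labels, num_classes):
    """Hypothetical sketch of inter-class distance distillation.

    feat_s, feat_t: (B, C, H, W) student / teacher feature maps.
    labels: (B, H, W) ground-truth class indices.
    """
    # Downsample labels to the feature resolution (nearest-neighbour
    # keeps them valid class indices).
    labels = F.interpolate(labels[:, None].float(), size=feat_s.shape[-2:],
                           mode="nearest").squeeze(1).long()

    def class_centers(feat):
        B, C, H, W = feat.shape
        f = feat.permute(0, 2, 3, 1).reshape(-1, C)   # (B*H*W, C)
        l = labels.reshape(-1)                        # (B*H*W,)
        centers = []
        for k in range(num_classes):
            mask = l == k
            # Mean feature of class k; zeros if the class is absent.
            centers.append(f[mask].mean(0) if mask.any() else f.new_zeros(C))
        return torch.stack(centers)                   # (K, C)

    cs, ct = class_centers(feat_s), class_centers(feat_t)
    # Pairwise Euclidean distances between class centers.
    ds = torch.cdist(cs, cs)
    dt = torch.cdist(ct, ct)
    # Match the student's inter-class structure to the teacher's.
    return F.mse_loss(ds, dt)
```

The loss is zero when the student reproduces the teacher's inter-class geometry exactly, and it is invariant to a global translation of the student's features, since only pairwise distances enter the objective.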
no code implementations • 28 Jan 2020 • Xiaoli Liu, Pan Hu, Zhi Mao, Po-Chih Kuo, Peiyao Li, Chao Liu, Jie Hu, Deyu Li, Desen Cao, Roger G. Mark, Leo Anthony Celi, Zhengbo Zhang, Feihu Zhou
This study aims to develop an interpretable and generalizable model for early mortality prediction in elderly patients with multiple organ dysfunction syndrome (MODS).