Search Results for author: Jian-Ping Mei

Found 5 papers, 3 papers with code

A Geometric Perspective on Diffusion Models

no code implementations · 31 May 2023 · Defang Chen, Zhenyu Zhou, Jian-Ping Mei, Chunhua Shen, Chun Chen, Can Wang

Recent years have witnessed significant progress in developing effective training and fast sampling techniques for diffusion models.

Denoising

Knowledge Distillation with the Reused Teacher Classifier

1 code implementation · CVPR 2022 · Defang Chen, Jian-Ping Mei, Hailin Zhang, Can Wang, Yan Feng, Chun Chen

Knowledge distillation aims to compress a powerful yet cumbersome teacher model into a lightweight student model without much sacrifice of performance (a minimal sketch of the basic objective follows this entry).

Knowledge Distillation
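
For context, here is a minimal sketch of the classic logit-matching distillation objective (Hinton et al.) that this line of work builds on; the temperature T and mixing weight alpha are illustrative defaults, and this is not the reused-classifier technique proposed in the paper itself.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic logit distillation: KL divergence between temperature-softened
    teacher and student distributions, mixed with the usual cross-entropy."""
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_probs = F.log_softmax(student_logits / T, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```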

Cross-Layer Distillation with Semantic Calibration

2 code implementations · 6 Dec 2020 · Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun Chen

Knowledge distillation is a technique to enhance the generalization ability of a student model by exploiting outputs from a teacher model (a generic feature-transfer sketch follows this entry).

Knowledge Distillation · Transfer Learning
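
As a rough illustration of feature-based cross-layer transfer, a student feature map can be projected to the teacher's channel width and regressed onto the teacher feature. This is a generic sketch, not the paper's semantic calibration mechanism, which learns cross-layer associations rather than fixing a single pair.

```python
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistiller(nn.Module):
    """Single-pair feature distillation: align student channels to the
    teacher's via a 1x1 conv, then penalize the L2 distance."""
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        aligned = self.proj(student_feat)
        if aligned.shape[-2:] != teacher_feat.shape[-2:]:
            # Match spatial resolution when the chosen layers differ in size.
            aligned = F.interpolate(aligned, size=teacher_feat.shape[-2:])
        return F.mse_loss(aligned, teacher_feat.detach())
```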

Online Knowledge Distillation with Diverse Peers

2 code implementations · 1 Dec 2019 · Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun Chen

The second-level distillation then transfers the knowledge of the ensemble of auxiliary peers to the group leader, i.e., the model used for inference (a sketch of this step follows this entry).

Knowledge Distillation · Transfer Learning
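
A hedged sketch of the second-level step described above, assuming the peer ensemble is formed by simple logit averaging; the paper's actual aggregation of the diverse peers may differ.

```python
import torch
import torch.nn.functional as F

def second_level_loss(leader_logits, peer_logits, T=3.0):
    """Distill the averaged predictions of the auxiliary peers into the
    group leader, i.e., the single model kept for inference."""
    with torch.no_grad():
        ensemble = torch.stack(peer_logits).mean(dim=0)  # average peer logits
        soft_targets = F.softmax(ensemble / T, dim=1)
    log_probs = F.log_softmax(leader_logits / T, dim=1)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * (T * T)
```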

Classification and its applications for drug-target interaction identification

no code implementations · 16 Feb 2015 · Jian-Ping Mei, Chee-Keong Kwoh, Peng Yang, Xiao-Li Li

Classification is one of the most popular and widely used supervised learning tasks, which categorizes objects into predefined classes based on prior knowledge (a toy example follows this entry).

Classification · Clustering · +2
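
As a purely illustrative example of the classification setting, here is a linear classifier trained on synthetic feature vectors; the data is a stand-in for drug-target pair descriptors, not the benchmarks used in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary task: 500 samples with 20-dimensional feature vectors.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.3f}")
```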
