Search Results for author: Mengya Gao

Found 7 papers, 2 papers with code

One to Transfer All: A Universal Transfer Framework for Vision Foundation Model with Few Data

no code implementations • 24 Nov 2021 • Yujie Wang, Junqin Huang, Mengya Gao, Yichao Wu, Zhenfei Yin, Ding Liang, Junjie Yan

Transferring to thousands of downstream tasks with few data, in a general way, is becoming a trend in the application of foundation models.

INTERN: A New Learning Paradigm Towards General Vision

no code implementations • 16 Nov 2021 • Jing Shao, Siyu Chen, Yangguang Li, Kun Wang, Zhenfei Yin, Yinan He, Jianing Teng, Qinghong Sun, Mengya Gao, Jihao Liu, Gengshi Huang, Guanglu Song, Yichao Wu, Yuming Huang, Fenggang Liu, Huan Peng, Shuo Qin, Chengyu Wang, Yujie Wang, Conghui He, Ding Liang, Yu Liu, Fengwei Yu, Junjie Yan, Dahua Lin, Xiaogang Wang, Yu Qiao

Enormous waves of technological innovation over the past several years, marked by advances in AI technologies, are profoundly reshaping industry and society.

Fixing the Teacher-Student Knowledge Discrepancy in Distillation

no code implementations • 31 Mar 2021 • Jiangfan Han, Mengya Gao, Yujie Wang, Quanquan Li, Hongsheng Li, Xiaogang Wang

To solve this problem, we propose a novel student-dependent distillation method, knowledge consistent distillation, which makes the teacher's knowledge more consistent with the student and provides the most suitable knowledge to each student network for distillation.

Tasks: Image Classification, Knowledge Distillation, +2

Residual Knowledge Distillation

no code implementations • 21 Feb 2020 • Mengya Gao, Yujun Shen, Quanquan Li, Chen Change Loy

Knowledge distillation (KD) is one of the most effective approaches to model compression.

Tasks: Knowledge Distillation, Model Compression

An Embarrassingly Simple Approach for Knowledge Distillation

1 code implementation • 5 Dec 2018 • Mengya Gao, Yujun Shen, Quanquan Li, Junjie Yan, Liang Wan, Dahua Lin, Chen Change Loy, Xiaoou Tang

Knowledge Distillation (KD) aims at improving the performance of a low-capacity student model by inheriting knowledge from a high-capacity teacher model.
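The teacher-to-student knowledge transfer described above is commonly implemented as a weighted sum of a soft-target term (KL divergence between temperature-softened teacher and student outputs) and a standard cross-entropy term on the ground-truth labels. A minimal NumPy sketch of this generic Hinton-style KD objective (the temperature `T` and mixing weight `alpha` are illustrative hyperparameters, not values from the paper, and this is the standard formulation rather than the paper's specific method):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax, numerically stabilized."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Generic KD objective: alpha * soft-target KL + (1 - alpha) * hard-label CE."""
    # Soft-target term: KL(teacher_T || student_T), scaled by T^2 so
    # gradient magnitudes stay comparable across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.mean(
        np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=1)
    ) * T * T
    # Hard-label term: cross-entropy with the ground-truth labels.
    p = softmax(student_logits)
    hard = -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))
    return alpha * soft + (1 - alpha) * hard
```

When student and teacher produce identical logits, the soft term vanishes and only the hard-label cross-entropy remains.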

Tasks: Face Recognition, Knowledge Distillation, +3

A Novel Hybrid Machine Learning Model for Auto-Classification of Retinal Diseases

1 code implementation • 17 Jun 2018 • C. -H. Huck Yang, Jia-Hong Huang, Fangyu Liu, Fang-Yi Chiu, Mengya Gao, Weifeng Lyu, I-Hung Lin M.D., Jesper Tegner

Automatic clinical diagnosis of retinal diseases has emerged as a promising approach to facilitate discovery in areas with limited access to specialists.

Tasks: BIG-bench Machine Learning, General Classification
