Search Results for author: Qingyun Liu

Found 4 papers, 1 paper with code

Wisdom of Committee: Distilling from Foundation Model to Specialized Application Model

no code implementations · 21 Feb 2024 · Zichang Liu, Qingyun Liu, Yuening Li, Liang Liu, Anshumali Shrivastava, Shuchao Bi, Lichan Hong, Ed H. Chi, Zhe Zhao

Further, to accommodate the dissimilarity among the teachers in the committee, we introduce DiverseDistill, which allows the student to understand the expertise of each teacher and extract task knowledge.

Knowledge Distillation · Transfer Learning
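No code is released for this paper, but the committee idea in the excerpt can be sketched: a student predicts per-example weights over a set of dissimilar teachers and minimizes a weighted distillation loss. The gating head, loss form, and all names below are illustrative assumptions, not the paper's implementation.

import torch.nn as nn
import torch.nn.functional as F

class CommitteeDistillLoss(nn.Module):
    # Minimal sketch of weighted multi-teacher distillation (assumed design).
    def __init__(self, num_teachers, student_feat_dim, temperature=2.0):
        super().__init__()
        # Gating head: maps student features to one weight per teacher.
        self.gate = nn.Linear(student_feat_dim, num_teachers)
        self.temperature = temperature

    def forward(self, student_logits, student_feats, teacher_logits_list):
        # Per-example softmax weights over the committee, predicted from the
        # student's own features (a proxy for "understanding teacher expertise").
        weights = F.softmax(self.gate(student_feats), dim=-1)                # (B, T)
        log_p_s = F.log_softmax(student_logits / self.temperature, dim=-1)   # (B, C)
        loss = 0.0
        for t, t_logits in enumerate(teacher_logits_list):
            p_t = F.softmax(t_logits / self.temperature, dim=-1)
            # Per-example KL between student and teacher t, weighted by the gate.
            kl = F.kl_div(log_p_s, p_t, reduction="none").sum(dim=-1)        # (B,)
            loss = loss + (weights[:, t] * kl).mean()
        return loss * self.temperature ** 2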

LEVI: Generalizable Fine-tuning via Layer-wise Ensemble of Different Views

no code implementations · 7 Feb 2024 · Yuji Roh, Qingyun Liu, Huan Gui, Zhe Yuan, Yujin Tang, Steven Euijong Whang, Liang Liu, Shuchao Bi, Lichan Hong, Ed H. Chi, Zhe Zhao

By combining two complementary models, LEVI effectively suppresses problematic features in both the fine-tuning data and the pre-trained model, while preserving useful features for new tasks.
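As a rough illustration of the layer-wise ensemble idea only (LEVI's actual architecture may differ), one view is a frozen pre-trained stack fused, layer by layer, into a small stack trained from scratch. Matching layer shapes and the additive fusion rule are assumptions for this sketch.

import torch.nn as nn

class LayerwiseEnsemble(nn.Module):
    # Illustrative only: layer shapes of the two stacks are assumed to match,
    # and the fusion rule (simple addition) is an assumption, not LEVI's design.
    def __init__(self, pretrained_layers, adaptive_layers, feat_dim, num_classes):
        super().__init__()
        self.pretrained = pretrained_layers   # nn.ModuleList, kept frozen
        self.adaptive = adaptive_layers       # nn.ModuleList, trained on the new task
        for p in self.pretrained.parameters():
            p.requires_grad = False
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        h_pre, h_new = x, x
        for f_pre, f_new in zip(self.pretrained, self.adaptive):
            h_pre = f_pre(h_pre)
            # Fuse the pre-trained view into the trainable path at every layer,
            # so neither view's spurious features dominate the final prediction.
            h_new = f_new(h_new) + h_pre
        return self.head(h_new)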

Talking Models: Distill Pre-trained Knowledge to Downstream Models via Interactive Communication

no code implementations · 4 Oct 2023 · Zhe Zhao, Qingyun Liu, Huan Gui, Bang An, Lichan Hong, Ed H. Chi

In this paper, we extend knowledge distillation (KD) with an interactive communication process to help students of downstream tasks learn effectively from pre-trained foundation models.

Decoder · Knowledge Distillation · +1
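The paper releases no code, but an "interactive communication" loop can be sketched as alternating message passing: the teacher encodes knowledge into messages the student fits, then the teacher-side encoder adapts toward what the student can decode. The encoder_msg/decoder_msg modules and this training scheme are hypothetical, not the paper's method.

import torch
import torch.nn.functional as F

def interactive_distill(teacher, student, encoder_msg, decoder_msg,
                        loader, opt_student, opt_teacher_side, rounds=3):
    # Hypothetical sketch of distillation as two-way communication.
    # opt_student should cover student + decoder_msg parameters;
    # opt_teacher_side should cover encoder_msg parameters.
    for _ in range(rounds):
        # Direction 1 (teacher -> student): the student fits the teacher's
        # encoded message alongside the downstream task labels.
        for x, y in loader:
            with torch.no_grad():
                msg = encoder_msg(teacher(x))
            logits = student(x)
            loss = F.cross_entropy(logits, y) + F.mse_loss(decoder_msg(logits), msg)
            opt_student.zero_grad(); loss.backward(); opt_student.step()
        # Direction 2 (student -> teacher): the teacher-side encoder adapts its
        # messages toward what the current student actually decodes (feedback).
        for x, _ in loader:
            with torch.no_grad():
                t_rep = teacher(x)
                s_dec = decoder_msg(student(x))
            loss = F.mse_loss(encoder_msg(t_rep), s_dec)
            opt_teacher_side.zero_grad(); loss.backward(); opt_teacher_side.step()
    return student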
