Search Results for author: Zhiliang Gan

Found 1 paper, 0 papers with code

KDLSQ-BERT: A Quantized Bert Combining Knowledge Distillation with Learned Step Size Quantization

no code implementations • 15 Jan 2021 • Jing Jin, Cai Liang, Tiancheng Wu, Liqin Zou, Zhiliang Gan

The main idea of our method is to leverage knowledge distillation (KD) to transfer knowledge from a "teacher" model to a "student" model, while learned step size quantization (LSQ) is used to quantize that "student" model during quantization-aware training.
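The sketch below illustrates the two ingredients named in the abstract: an LSQ-style fake-quantizer with a learnable step size, and a standard soft-label KD loss that distills a frozen full-precision "teacher" into the quantized "student". It is a minimal illustration under assumptions, not the paper's exact formulation: the class and function names, the 8-bit width, the temperature T, the mixing weight alpha, and the constant step-size initialization are all placeholders chosen for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def grad_scale(x, scale):
    # Forward: identity. Backward: gradient multiplied by `scale`
    # (the step-size gradient scaling used in LSQ).
    return (x - x * scale).detach() + x * scale

def round_ste(x):
    # Round to the nearest integer, passing gradients straight through.
    return (x.round() - x).detach() + x

class LSQQuantizer(nn.Module):
    """Fake-quantizer with a learned step size for signed b-bit values."""

    def __init__(self, bits=8):
        super().__init__()
        self.qn = -(2 ** (bits - 1))      # lowest quantization level
        self.qp = 2 ** (bits - 1) - 1     # highest quantization level
        # Learnable step size; LSQ initializes it from tensor statistics,
        # a constant is used here only to keep the sketch short.
        self.step = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        g = 1.0 / (x.numel() * self.qp) ** 0.5  # gradient scale for the step size
        s = grad_scale(self.step, g)
        x_q = round_ste((x / s).clamp(self.qn, self.qp))
        return x_q * s                           # dequantized output used in training

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.7):
    # Soft-label distillation (teacher -> quantized student), mixed with
    # the hard-label task loss.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In this setup, the student's weight (and optionally activation) tensors pass through such a quantizer during training, and the combined loss is minimized while the teacher stays frozen, so the step sizes are learned jointly with the remaining parameters. The paper's full objective may distill more than final logits; this sketch captures only the basic idea.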

Tasks: Knowledge Distillation, Language Modelling, +1
