Search Results for author: Wang Qi

Found 2 papers, 0 papers with code

Parameter-Efficient Tuning on Layer Normalization for Pre-trained Language Models

no code implementations · 16 Nov 2022 · Wang Qi, Yu-Ping Ruan, Yuan Zuo, Taihao Li

Conventional fine-tuning becomes increasingly difficult given the size of current Pre-trained Language Models, which makes parameter-efficient tuning the focal point of frontier research.
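The idea named in the title — tuning only the Layer Normalization parameters while freezing the rest of the model — can be illustrated with a minimal sketch. The parameter names and sizes below are hypothetical placeholders, not the paper's actual model or code; the point is just the selection rule and the resulting tiny trainable fraction.

```python
# Hypothetical sketch of LayerNorm-only parameter-efficient tuning.
# The (name, parameter_count) pairs below are made-up placeholders
# standing in for a Transformer encoder layer's parameters.
model_params = [
    ("encoder.layer.0.attention.query.weight", 768 * 768),
    ("encoder.layer.0.attention.query.bias", 768),
    ("encoder.layer.0.output.LayerNorm.weight", 768),
    ("encoder.layer.0.output.LayerNorm.bias", 768),
    ("encoder.layer.0.ffn.weight", 768 * 3072),
    ("encoder.layer.0.ffn.LayerNorm.weight", 768),
]

def select_trainable(params):
    """Mark only LayerNorm parameters as trainable; freeze everything else."""
    return {name: ("LayerNorm" in name) for name, _ in params}

trainable = select_trainable(model_params)
tuned = sum(n for name, n in model_params if trainable[name])
total = sum(n for _, n in model_params)
print(f"tuned fraction: {tuned / total:.4%}")
```

In a real framework the same rule would be applied by toggling each parameter's gradient flag based on its module name; here the dictionary of booleans plays that role, and the printed fraction shows that only a sliver of the parameters would be updated.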
