Search Results for author: Kuang-Ming Chen

Found 2 papers, 1 paper with code

Chat Vector: A Simple Approach to Equip LLMs with Instruction Following and Model Alignment in New Languages

no code implementations • 7 Oct 2023 • Shih-Cheng Huang, Pin-Zu Li, Yu-Chi Hsu, Kuang-Ming Chen, Yu Tung Lin, Shih-Kai Hsiao, Richard Tzong-Han Tsai, Hung-Yi Lee

By simply adding the chat vector to a continual pre-trained model's weights, we can endow the model with chat capabilities in new languages without the need for further training.

Instruction Following
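
The chat-vector recipe summarized above is plain weight arithmetic: subtract the base model's weights from its chat-tuned counterpart, then add that difference to a model continually pre-trained on the target language. Below is a minimal sketch of this idea with Hugging Face Transformers, assuming all three checkpoints share the same architecture and parameter shapes; the model identifiers and output path are placeholders, not the paper's exact setup.

```python
import torch
from transformers import AutoModelForCausalLM

# Placeholder checkpoint names (assumptions for illustration, not the paper's exact setup).
BASE_ID = "meta-llama/Llama-2-7b-hf"              # base pre-trained model
CHAT_ID = "meta-llama/Llama-2-7b-chat-hf"         # its instruction-tuned (chat) counterpart
CP_ID = "path/to/continually-pretrained-model"    # continually pre-trained on the new language

base = AutoModelForCausalLM.from_pretrained(BASE_ID, torch_dtype=torch.float32)
chat = AutoModelForCausalLM.from_pretrained(CHAT_ID, torch_dtype=torch.float32)
target = AutoModelForCausalLM.from_pretrained(CP_ID, torch_dtype=torch.float32)

base_sd = base.state_dict()
chat_sd = chat.state_dict()
target_sd = target.state_dict()

with torch.no_grad():
    for name, weight in target_sd.items():
        # chat vector = chat weights - base weights, computed per parameter tensor;
        # adding it to the continually pre-trained weights transfers chat behaviour
        # (assumes identical tensor shapes across all three checkpoints).
        target_sd[name] = weight + (chat_sd[name] - base_sd[name])

target.load_state_dict(target_sd)
target.save_pretrained("llama-2-7b-newlang-chat")
```

If the continually pre-trained checkpoint extends the tokenizer, the mismatched embedding rows would need separate handling; the sketch above assumes the vocabulary is unchanged.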

Compressing Transformer-based self-supervised models for speech processing

1 code implementation • 17 Nov 2022 • Tzu-Quan Lin, Tsung-Huan Yang, Chun-Yao Chang, Kuang-Ming Chen, Tzu-hsun Feng, Hung-Yi Lee, Hao Tang

Despite the success of Transformers in self-supervised learning with applications to various downstream tasks, the computational cost of training and inference remains a major challenge for applying these models to a wide spectrum of devices.

Knowledge Distillation • Model Compression +1
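
The tags above point to knowledge distillation as one route to compressing these self-supervised speech encoders. As a generic illustration only, not the paper's exact objective, a feature-level distillation loss for a smaller student mimicking a larger teacher might combine an L1 term with a cosine-similarity term, a common recipe for distilling HuBERT-style encoders:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_hidden, teacher_hidden, l1_weight=1.0, cos_weight=1.0):
    """Generic feature-level distillation loss (illustrative assumption, not the paper's recipe).

    student_hidden, teacher_hidden: (batch, time, dim) hidden states to match.
    """
    l1 = F.l1_loss(student_hidden, teacher_hidden)
    cos = 1.0 - F.cosine_similarity(student_hidden, teacher_hidden, dim=-1).mean()
    return l1_weight * l1 + cos_weight * cos

# Toy usage with random tensors standing in for real encoder outputs.
student = torch.randn(4, 100, 768, requires_grad=True)
teacher = torch.randn(4, 100, 768)
loss = distillation_loss(student, teacher.detach())
loss.backward()
```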
