no code implementations • 24 May 2022 • Chi-Liang Liu, Hung-Yi Lee, Wen-tau Yih
We propose structured prompt tuning, a simple and effective method to improve prompt tuning.
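To fix ideas, here is a minimal sketch of vanilla (unstructured) prompt tuning — a small matrix of trainable soft-prompt vectors prepended to the frozen model's input embeddings. This is the baseline the paper improves on, not the structured variant itself; all shapes and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, prompt_len, seq_len = 16, 4, 10

# Frozen token embeddings for one input sequence (stand-in for a real encoder input).
token_embeddings = rng.normal(size=(seq_len, d_model))

# Trainable soft-prompt vectors -- the only parameters updated in prompt tuning.
soft_prompt = rng.normal(scale=0.02, size=(prompt_len, d_model))

# Prompt tuning prepends the soft prompt to the token embeddings; the
# concatenated sequence is then fed to a frozen language model.
model_input = np.concatenate([soft_prompt, token_embeddings], axis=0)

print(model_input.shape)  # (14, 16)
```

During training, gradients flow only into `soft_prompt`; the backbone stays frozen, which is what makes prompt tuning parameter-efficient.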
1 code implementation • 7 May 2021 • Yi-Chen Chen, Po-Han Chi, Shu-wen Yang, Kai-Wei Chang, Jheng-Hao Lin, Sung-Feng Huang, Da-Rong Liu, Chi-Liang Liu, Cheng-Kuang Lee, Hung-Yi Lee
Multi-task learning across a wide variety of speech processing tasks with a single universal model has not yet been studied.
3 code implementations • EMNLP (MRQA) 2021 • Chi-Liang Liu, Hung-Yi Lee
In this paper, we study the possibility of almost unsupervised Multiple-Choice Question Answering (MCQA).
1 code implementation • 20 Oct 2020 • Chi-Liang Liu, Tsung-Yuan Hsu, Yung-Sung Chuang, Chung-Yi Li, Hung-Yi Lee
Token embeddings in multilingual BERT (m-BERT) contain both language and semantic information.
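One simple way to see the two components the snippet describes is mean-centering: treat each language's average embedding as its language component and the residual as the token-specific (semantic) part. The toy sketch below illustrates only this decomposition idea with synthetic vectors; it is not the paper's actual procedure, and the language labels and shapes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8

# Toy embeddings: each language's vectors share a language-specific offset
# plus a token-specific (semantic) component.
lang_offset = {"en": rng.normal(size=d), "zh": rng.normal(size=d)}
embeddings = {
    lang: offset + rng.normal(size=(20, d))
    for lang, offset in lang_offset.items()
}

# Subtracting the per-language mean strips the shared language component,
# leaving token-specific residuals that are better aligned across languages.
centered = {lang: e - e.mean(axis=0) for lang, e in embeddings.items()}

for lang, e in centered.items():
    print(lang, e.mean(axis=0).round(6))
```

After centering, each language's mean is (numerically) zero, so whatever separates the two sets of vectors afterwards is no longer a constant language offset.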
no code implementations • 20 Oct 2020 • Chi-Liang Liu, Tsung-Yuan Hsu, Yung-Sung Chuang, Hung-Yi Lee
Multilingual BERT has recently worked remarkably well on cross-lingual transfer tasks, outperforming static non-contextualized word embeddings.
no code implementations • 20 Apr 2020 • Chi-Liang Liu, Tsung-Yuan Hsu, Yung-Sung Chuang, Hung-Yi Lee
Multilingual BERT has recently worked remarkably well on cross-lingual transfer tasks, outperforming static non-contextualized word embeddings.
no code implementations • 25 Oct 2019 • Yung-Sung Chuang, Chi-Liang Liu, Hung-Yi Lee, Lin-shan Lee
Beyond its potential for end-to-end SQA, SpeechBERT can also be applied to many other spoken language understanding tasks, just as BERT is to many text processing tasks.
Ranked #3 on Spoken Language Understanding on Spoken-SQuAD
no code implementations • 25 Sep 2019 • Chun-Hsing Lin, Alvin Chiang, Chi-Liang Liu, Chien-Fu Lin, Po-Hsien Chu, Siang-Ruei Wu, Yi-En Tsai, Chung-Yang (Ric) Huang
Score-function-based text generation approaches such as REINFORCE generally suffer from high computational complexity and training instability.
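For context, the score-function (REINFORCE) estimator referred to here computes the gradient of an expected reward as the sample average of `R(x) * ∇ log p(x)`. The sketch below demonstrates the estimator on a toy categorical "token" distribution and compares it with the exact gradient; the reward values and sample count are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Categorical policy over 3 tokens, parameterized by logits.
logits = np.zeros(3)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

reward = np.array([0.0, 1.0, 0.0])  # only token 1 is rewarded

# REINFORCE / score-function estimate of d E[R] / d logits:
# average over samples of R(x) * d log p(x) / d logits.
n_samples = 5000
probs = softmax(logits)
samples = rng.choice(3, size=n_samples, p=probs)
grad = np.zeros(3)
for x in samples:
    score = -probs.copy()
    score[x] += 1.0          # d log p(x) / d logits for a softmax policy
    grad += reward[x] * score
grad /= n_samples

# Exact gradient for comparison: d E[R] / d logits = p * (R - E[R]).
exact = probs * (reward - probs @ reward)
print(grad, exact)
```

Even in this three-token toy problem the estimate is noisy at 5000 samples, which hints at the variance (and hence training instability) the snippet mentions for full-scale text generation.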
no code implementations • IJCNLP 2019 • Tsung-Yuan Hsu, Chi-Liang Liu, Hung-Yi Lee
Because it is not feasible to collect training data for every language, there is a growing interest in cross-lingual transfer learning.
1 code implementation • 1 Apr 2018 • Chia-Hsuan Li, Szu-Lin Wu, Chi-Liang Liu, Hung-Yi Lee
Reading comprehension has been widely studied.
Ranked #4 on Spoken Language Understanding on Spoken-SQuAD