1 code implementation • ACL 2022 • HongSeok Choi, Dongha Choi, Hyunju Lee
The proposed method is advantageous because it does not require a separate validation set and instead uses a large unlabeled set to determine a better stopping point.
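For context, a minimal sketch of one way an unlabeled pool can drive early stopping, using a prediction-stability criterion: training halts once the model's predicted labels on the unlabeled set stop changing. The criterion and the helper names (`predict`, `train_one_epoch`, `unlabeled_loader`) are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import numpy as np

def prediction_change_rate(prev_preds, curr_preds):
    """Fraction of unlabeled examples whose predicted label changed
    between two consecutive checkpoints."""
    prev_preds = np.asarray(prev_preds)
    curr_preds = np.asarray(curr_preds)
    return float(np.mean(prev_preds != curr_preds))

def should_stop(change_history, patience=3, threshold=0.01):
    """Stop once the change rate has stayed below `threshold`
    for `patience` consecutive checkpoints."""
    if len(change_history) < patience:
        return False
    return all(c < threshold for c in change_history[-patience:])

# Hypothetical usage: after each epoch, predict on the unlabeled pool
# and record how much the predictions moved.
# change_history = []
# prev = predict(model, unlabeled_loader)
# for epoch in range(max_epochs):
#     train_one_epoch(model, labeled_loader)
#     curr = predict(model, unlabeled_loader)
#     change_history.append(prediction_change_rate(prev, curr))
#     prev = curr
#     if should_stop(change_history):
#         break
```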
1 code implementation • ACL 2022 • Dongha Choi, HongSeok Choi, Hyunju Lee
In this study, we propose a domain knowledge transferring (DoKTra) framework for pretrained language models (PLMs) that does not require additional in-domain pretraining.
no code implementations • 12 Nov 2021 • HongSeok Choi, Hyunju Lee
Finally, the proposed learning strategy is to train on all samples starting from the good initialization parameters and to stop training with an early stopping technique.
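A minimal sketch of that strategy, assuming a PyTorch-style model that has already been given a good initialization in an earlier stage (not shown): train on all samples and stop when a monitored score stops improving. The callbacks `train_one_epoch` and `evaluate` and the patience values are placeholders, not the paper's exact procedure.

```python
import copy

def train_with_early_stopping(model, all_samples, evaluate, train_one_epoch,
                              max_epochs=50, patience=5):
    """Train on all samples from the given (well-initialized) model and stop
    when the monitored score has not improved for `patience` epochs."""
    best_score, best_state, epochs_without_improvement = float("-inf"), None, 0
    for epoch in range(max_epochs):
        train_one_epoch(model, all_samples)        # one pass over every sample
        score = evaluate(model)                    # monitored stopping signal
        if score > best_score:
            best_score = score
            best_state = copy.deepcopy(model.state_dict())
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:  # early stopping
            break
    model.load_state_dict(best_state)               # restore best checkpoint
    return model
```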
1 code implementation • SEMEVAL 2018 • HongSeok Choi, Hyunju Lee
A key idea behind our system is to make full use of transfer learning from the Natural Language Inference (NLI) task to this task.
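As a modern illustration of NLI transfer (the 2018 system itself used a pre-BERT encoder), one can warm-start from a checkpoint already fine-tuned on MNLI and continue fine-tuning on the target task. The checkpoint choice, the two-label target task, and the sentence pair below are assumptions for the sketch.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed NLI-finetuned checkpoint; any model fine-tuned on NLI could be used.
nli_checkpoint = "roberta-large-mnli"

tokenizer = AutoTokenizer.from_pretrained(nli_checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    nli_checkpoint,
    num_labels=2,                  # target-task labels replace the 3 NLI labels
    ignore_mismatched_sizes=True,  # reinitialize the classification head
)

# Encode the input as a sentence pair, mirroring the NLI input format so the
# transferred inference knowledge applies directly.
batch = tokenizer(
    ["We should ban plastic bags."],
    ["Plastic bags harm marine life."],
    padding=True, truncation=True, return_tensors="pt",
)
outputs = model(**batch)
print(outputs.logits.shape)  # torch.Size([1, 2]); fine-tuning proceeds as usual
```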