no code implementations • RepL4NLP (ACL) 2022 • Changwook Jun, Hansol Jang, Myoseop Sim, Hyun Kim, Jooyoung Choi, Kyungkoo Min, Kyunghoon Bae
Pre-trained language models have brought significant performance improvements across a variety of natural language processing tasks.
1 code implementation • 22 Feb 2024 • Hanseok Oh, Hyunji Lee, Seonghyeon Ye, Haebin Shin, Hansol Jang, Changwook Jun, Minjoon Seo
Enhancing the ability of retrievers to understand the intentions and preferences of users, akin to instructions for language models, has the potential to yield more aligned search targets.
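As a rough illustration of the idea, the sketch below conditions a dense retriever on a free-form user instruction by prepending it to the query before embedding. The checkpoint, the instruction text, and the simple concatenation scheme are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch of instruction-conditioned dense retrieval.
# Assumptions (not from the paper): the checkpoint, the instruction
# text, and simply prepending the instruction to the query.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    # Mean-pool token embeddings, ignoring padding positions.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

instruction = "I prefer beginner-friendly tutorials with code."  # hypothetical
query = "how do transformers work"
docs = ["A gentle, code-first introduction to transformers.",
        "Formal analysis of attention as kernel smoothing."]

q = embed([f"{instruction} {query}"])   # instruction-conditioned query
d = embed(docs)
scores = F.cosine_similarity(q, d)      # rank documents by similarity
print(scores)
```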
1 code implementation • LREC 2022 • Changwook Jun, Jooyoung Choi, Myoseop Sim, Hyun Kim, Hansol Jang, Kyungkoo Min
We then build a Transformer-based pre-trained language model and fine-tune it for table question answering with these datasets.
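A minimal sketch of the general recipe is below: serialize the table into a flat token sequence, then run an extractive QA head over it. The [HEADER]/[ROW] markers and the multilingual BERT checkpoint are assumptions standing in for the paper's own setup.

```python
# Sketch: serialize a table for a Transformer extractive-QA model.
# Assumptions (illustrative only): the [HEADER]/[ROW] markers and the
# multilingual BERT checkpoint are not the authors' exact scheme.
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

def flatten_table(header, rows):
    # One flat string: header cells, then each row, separated by markers.
    parts = ["[HEADER] " + " | ".join(header)]
    parts += ["[ROW] " + " | ".join(map(str, row)) for row in rows]
    return " ".join(parts)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
# Note: this QA head starts untrained; fine-tune on a table QA dataset
# before the predicted spans become meaningful.
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-multilingual-cased")

question = "Which city hosted the 2018 games?"
context = flatten_table(["Year", "City"], [[2014, "Sochi"], [2018, "Pyeongchang"]])

inputs = tokenizer(question, context, truncation=True, return_tensors="pt")
outputs = model(**inputs)  # start/end logits over the flattened table
start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax()
print(tokenizer.decode(inputs["input_ids"][0][start:end + 1]))
```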
1 code implementation • 10 Aug 2020 • Sangah Lee, Hansol Jang, Yunmee Baik, Suzi Park, Hyopil Shin
Since the appearance of BERT, recent works including XLNet and RoBERTa have utilized sentence embedding models pre-trained on large corpora with large numbers of parameters.
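For context, the pre-training objective shared by this BERT-style family is masked language modeling. The sketch below shows a single MLM training step with placeholder sentences and a placeholder checkpoint, not the paper's corpus or model.

```python
# Sketch of one masked-language-modeling step, the pre-training objective
# behind BERT-style sentence encoders. Checkpoint and sentences are
# placeholders, not the paper's training setup.
import torch
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

sentences = ["Language models learn from large corpora.",
             "Masked tokens are predicted from context."]
encoded = [tokenizer(s) for s in sentences]
batch = collator(encoded)                 # randomly masks ~15% of tokens

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**batch).loss                # cross-entropy on masked positions
loss.backward()
optimizer.step()
print(float(loss))
```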