no code implementations • RepL4NLP (ACL) 2022 • Changwook Jun, Hansol Jang, Myoseop Sim, Hyun Kim, Jooyoung Choi, Kyungkoo Min, Kyunghoon Bae
Pre-trained language models have brought significant performance improvements across a variety of natural language processing tasks.
1 code implementation • LREC 2022 • Changwook Jun, Jooyoung Choi, Myoseop Sim, Hyun Kim, Hansol Jang, Kyungkoo Min
We then build a pre-trained language model based on the Transformer architecture and fine-tune it for table question answering with these datasets.