Search Results for author: Sejune Joo

Found 2 papers, 2 papers with code

Semiparametric Token-Sequence Co-Supervision

1 code implementation · 14 Mar 2024 · Hyunji Lee, Doyoung Kim, Jihoon Jun, Sejune Joo, Joel Jang, Kyoung-Woon On, Minjoon Seo

In particular, the robustness of the parametric token space, established during pretraining, tends to enhance the stability of the nonparametric sequence embedding space, a new space established by another language model.

Language Modelling

How Well Do Large Language Models Truly Ground?

1 code implementation · 15 Nov 2023 · Hyunji Lee, Sejune Joo, Chaeeun Kim, Joel Jang, Doyoung Kim, Kyoung-Woon On, Minjoon Seo

Reliance on the inherent knowledge of Large Language Models (LLMs) can cause issues such as hallucinations, lack of control, and difficulties in integrating variable knowledge.
