1 code implementation • 24 Feb 2024 • Soyoung Yoon, Eunbi Choi, Jiyeon Kim, Yireun Kim, Hyeongu Yun, Seung-won Hwang
We propose ListT5, a novel reranking approach based on Fusion-in-Decoder (FiD) that handles multiple candidate passages at both training and inference time.
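The FiD pattern behind this approach can be sketched structurally: each candidate passage is encoded independently with the query, and a single decoder then attends over the concatenated encoder outputs to score all candidates jointly. The sketch below is a toy illustration of that data flow, not the actual ListT5 model — the `encode` function and the mean-vector "decoder" are stand-ins for a real T5 encoder and decoder.

```python
import numpy as np

def encode(query: str, passage: str, dim: int = 8) -> np.ndarray:
    """Stand-in encoder: one vector per (query, passage) pair.
    A real FiD model runs a T5 encoder over the concatenated text."""
    seed = abs(hash((query, passage))) % (2**32)  # deterministic toy embedding
    return np.random.default_rng(seed).standard_normal(dim)

def fid_rerank(query: str, passages: list[str]) -> list[int]:
    """FiD pattern: encode each candidate independently, concatenate the
    encoder outputs, and let one 'decoder' attend over the fused sequence
    to rank all candidates at once. The scorer here (dot product with the
    mean vector) is purely illustrative."""
    fused = np.stack([encode(query, p) for p in passages])  # (n, dim)
    context = fused.mean(axis=0)       # stand-in for decoder cross-attention
    scores = fused @ context           # one score per candidate passage
    return list(np.argsort(-scores))   # passage indices, best first

ranking = fid_rerank("what is FiD?", ["passage a", "passage b", "passage c"])
```

The key structural point is that the candidates interact only in the fused scoring step, so encoding cost stays linear in the number of passages.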
no code implementations • 24 May 2023 • Sunkyoung Kim, Dayeon Ki, Yireun Kim, Jinsik Lee
Existing cross-lingual transfer (CLT) prompting methods consider only monolingual demonstration examples in the source language.
2 code implementations • 28 Feb 2023 • Seonghyeon Ye, Hyeonbin Hwang, Sohee Yang, Hyeongu Yun, Yireun Kim, Minjoon Seo
In this paper, we present our finding that prepending a Task-Agnostic Prefix Prompt (TAPP) to the input improves the instruction-following ability of various Large Language Models (LLMs) during inference.
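The core mechanism described here is simple to illustrate: a fixed, task-agnostic prefix is prepended to every input before it reaches the model. The sketch below uses an illustrative prefix string and a hypothetical `build_input` helper — the actual TAPP in the paper is a specific fixed prompt, not reproduced here.

```python
# Illustrative prefix; the paper's actual task-agnostic prefix prompt differs.
TAPP = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
)

def build_input(instruction: str) -> str:
    """Prepend the task-agnostic prefix prompt to a user instruction
    before passing it to the LLM (hypothetical helper)."""
    return TAPP + f"Instruction: {instruction}\nResponse:"

prompt = build_input("Translate 'bonjour' to English.")
```

Because the prefix is task-agnostic, the same string is reused for every instruction at inference time with no per-task tuning.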
1 code implementation • 6 Sep 2022 • Janghoon Han, Joongbo Shin, Hosung Song, Hyunjik Jo, Gyeonghun Kim, Yireun Kim, Stanley Jungkyu Choi
In the experiment, we investigate the effect of weighted negative sampling, post-training, and style transfer.
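Of the three factors investigated, weighted negative sampling is the most mechanical, and can be sketched as drawing negatives with probability proportional to a per-candidate weight (e.g. giving harder negatives more mass). This is a generic sketch under that assumption — the paper's exact weighting scheme is not reproduced here.

```python
import random

def sample_negatives(candidates: list[str], weights: list[float],
                     k: int, seed: int = 0) -> list[str]:
    """Weighted negative sampling sketch: draw k negatives with
    replacement, where higher-weight (e.g. harder) candidates are
    sampled more often."""
    rng = random.Random(seed)
    return rng.choices(candidates, weights=weights, k=k)

# Candidates and weights are illustrative: 'hard_neg' gets 5x the mass.
negs = sample_negatives(["easy_neg1", "easy_neg2", "hard_neg"],
                        weights=[1.0, 1.0, 5.0], k=4)
```

Sampling with replacement keeps the sketch simple; a production pipeline would typically also exclude the gold passage from the candidate pool.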