no code implementations • 24 Mar 2024 • Minchan Kim, Minyeong Kim, Junik Bae, Suhwan Choi, Sungkyung Kim, Buru Chang
Subsequently, ESREAL computes token-level hallucination scores by assessing the semantic similarity of aligned regions based on the type of hallucination.
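The scoring idea described above can be sketched as a similarity check between a generated token and its aligned image region. The function name and the 1-minus-cosine rule below are illustrative assumptions, not ESREAL's actual formulation:

```python
import math

def _cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def hallucination_score(token_emb, region_emb):
    """Toy token-level hallucination score (hypothetical): one minus the
    cosine similarity between a generated token's embedding and the image
    region aligned to it. A higher score suggests the token is less
    grounded in the image."""
    return 1.0 - _cosine(token_emb, region_emb)
```

With identical embeddings the score is 0 (fully grounded), and for orthogonal embeddings it is 1.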
no code implementations • 3 Feb 2023 • Byounggyu Lew, Donghyun Son, Buru Chang
Although task-specific knowledge can be learned from source domains by fine-tuning, doing so hurts the generalization power of pre-trained models due to gradient bias toward the source domains.
1 code implementation • ICCV 2023 • Seong Min Kye, Kwanghee Choi, Hyeongmin Byun, Buru Chang
Active learning (AL) aims to select the most useful data samples from an unlabeled data pool and annotate them to expand the labeled dataset under a limited budget.
1 code implementation • 11 Oct 2022 • Seungju Han, Beomsu Kim, Buru Chang
In this paper, we introduce a new automatic evaluation metric to measure the semantic diversity of generated responses.
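One simple way to quantify semantic diversity is the mean pairwise cosine distance between embeddings of the generated responses. This is only an illustrative stand-in; the paper defines its own metric:

```python
import math

def _cos(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def semantic_diversity(embeddings):
    """Illustrative diversity measure (not the paper's metric): mean
    pairwise cosine distance between sentence embeddings of generated
    responses; higher values mean more semantically diverse output."""
    n = len(embeddings)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    return sum(1.0 - _cos(embeddings[i], embeddings[j])
               for i, j in pairs) / len(pairs)
```

Identical responses score 0; mutually orthogonal embeddings score 1.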
1 code implementation • 16 Aug 2022 • Donghyun Son, Byounggyu Lew, Kwanghee Choi, Yongsu Baek, Seungwoo Choi, Beomjun Shin, Sungjoo Ha, Buru Chang
In this study, we formulate real-world scenarios of content moderation and introduce a simple yet effective threshold optimization method that searches the optimal thresholds of the multiple subtasks to make a reliable moderation decision in a cost-effective way.
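A minimal sketch of per-subtask threshold search is a grid search over a validation set. The F1 criterion and function name here are assumptions for illustration; the paper searches thresholds across multiple subtasks jointly, trading off cost and reliability:

```python
def best_threshold(scores, labels, grid=None):
    """Hypothetical single-subtask threshold search: scan a grid of
    decision thresholds over validation scores and return the one that
    maximizes F1 for that moderation subtask."""
    grid = grid or [i / 100 for i in range(1, 100)]

    def f1(t):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y)
        return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

    return max(grid, key=f1)
```

On a toy validation set with positive scores 0.9 and 0.8 and negative scores 0.3 and 0.2, any threshold between 0.3 and 0.8 separates the classes perfectly.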
1 code implementation • NAACL 2022 • Seungju Han, Beomsu Kim, Jin Yong Yoo, Seokjun Seo, SangBum Kim, Enkhbayar Erdenee, Buru Chang
To better reflect the style of the character, PDP builds the prompts in the form of dialog that includes the character's utterances as dialog history.
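Building a dialog-form prompt from a character's past utterances can be sketched as simple string assembly. Names and formatting below are hypothetical, not PDP's exact template:

```python
def build_dialog_prompt(character, utterances, user_message):
    """Rough sketch of a dialog-style prompt: the character's past
    utterances are laid out as dialog history so a pre-trained dialog
    model continues in the character's voice (template is illustrative)."""
    lines = [f"{character}: {u}" for u in utterances]
    lines.append(f"User: {user_message}")
    lines.append(f"{character}:")  # model completes this final turn
    return "\n".join(lines)
```

For example, `build_dialog_prompt("Sherlock", ["Elementary."], "Who are you?")` ends with an open `Sherlock:` turn for the model to complete.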
1 code implementation • NLP4ConvAI (ACL) 2022 • Seungju Han, Beomsu Kim, Seokjun Seo, Enkhbayar Erdenee, Buru Chang
Extensive experiments demonstrate that our proposed training method alleviates the drawbacks of the existing exemplar-based generative models and significantly improves performance in terms of appropriateness and informativeness.
1 code implementation • 29 Nov 2021 • Seong Min Kye, Kwanghee Choi, Joonyoung Yi, Buru Chang
Recent studies on learning with noisy labels have shown remarkable performance by exploiting a small clean dataset.
no code implementations • 27 Oct 2021 • Kwanghee Choi, Martin Kersner, Jacob Morton, Buru Chang
Improving the performance of on-device audio classification models remains a challenge given the computational limits of the mobile environment.
Ranked #7 on Audio Classification on FSD50K
1 code implementation • Findings (EMNLP) 2021 • Beomsu Kim, Seokjun Seo, Seungju Han, Enkhbayar Erdenee, Buru Chang
G2R consists of two distinct techniques of distillation: the data-level G2R augments the dialogue dataset with additional responses generated by the large-scale generative model, and the model-level G2R transfers the response quality score assessed by the generative model to the score of the retrieval model by the knowledge distillation loss.
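The model-level transfer described above can be sketched as a distillation loss over candidate responses. The KL-divergence form below is an assumption for illustration (the paper specifies its own knowledge distillation loss):

```python
import math

def _softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def g2r_distill_loss(retrieval_scores, generator_scores):
    """Illustrative model-level G2R objective: KL(teacher || student)
    over candidate responses, pushing the retrieval model's (student)
    score distribution toward the quality scores assessed by the
    large-scale generative model (teacher)."""
    p = _softmax(generator_scores)   # teacher distribution
    q = _softmax(retrieval_scores)   # student distribution
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when the two models rank candidates with identical score distributions, and positive otherwise.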
no code implementations • 15 Apr 2021 • Joonyoung Yi, Buru Chang
Despite the rapid growth of online advertisement in developing countries, existing highly over-parameterized Click-Through Rate (CTR) prediction models are difficult to deploy due to limited computing resources.
no code implementations • EACL 2021 • Buru Chang, Inggeol Lee, Hyunjae Kim, Jaewoo Kang
Several machine learning-based spoiler detection models have been proposed recently to protect users from spoilers on review websites.
2 code implementations • CVPR 2021 • Youngkyu Hong, Seungju Han, Kwanghee Choi, Seokjun Seo, Beomsu Kim, Buru Chang
Although this method surpasses state-of-the-art methods on benchmark datasets, it can be further improved by directly disentangling the source label distribution from the model prediction in the training phase.
Ranked #20 on Long-tail Learning on Places-LT
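Disentangling the source label distribution from the model's prediction can be sketched as a log-prior correction on the logits. This simplified adjust-at-prediction form is an assumption for illustration, not the paper's exact training-phase formulation:

```python
import math

def adjust_logits(logits, class_counts):
    """Illustrative prior correction for long-tailed data: subtract the
    log class prior (estimated from source label counts) from each logit,
    so head classes no longer dominate purely because they were frequent
    in training."""
    total = sum(class_counts)
    log_prior = [math.log(c / total) for c in class_counts]
    return [z - lp for z, lp in zip(logits, log_prior)]
```

For example, with logits `[2.0, 1.5]` and a 900/100 class split, the correction flips the prediction to the tail class, since the head class's raw score was mostly prior.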