Search Results for author: Gihun Lee

Found 13 papers, 7 papers with code

Learning to Summarize from LLM-generated Feedback

no code implementations • 17 Oct 2024 • Hwanjun Song, Taewon Yun, Yuho Lee, Jihwan Oh, Gihun Lee, Jason Cai, Hang Su

Developing effective text summarizers remains a challenge due to issues like hallucinations, key information omissions, and verbosity in LLM-generated summaries.

CUPID: A Real-Time Session-Based Reciprocal Recommendation System for a One-on-One Social Discovery Platform

no code implementations • 8 Oct 2024 • Beomsu Kim, SangBum Kim, Minchan Kim, Joonyoung Yi, Sungjoo Ha, Suhyun Lee, Youngsoo Lee, Gihun Yeom, Buru Chang, Gihun Lee

However, conventional session-based approaches struggle with high latency due to the demands of modeling sequential user behavior for each recommendation process.

Recommendation Systems

BAPO: Base-Anchored Preference Optimization for Overcoming Forgetting in Large Language Models Personalization

no code implementations • 30 Jun 2024 • Gihun Lee, Minchan Jeong, Yujin Kim, Hojung Jung, Jaehoon Oh, Sangmook Kim, Se-Young Yun

To this end, we introduce Base-Anchored Preference Optimization (BAPO), a simple yet effective approach that utilizes the initial responses of the reference model to mitigate forgetting while accommodating personalized alignment.

Continual Learning · General Knowledge +3

FedFN: Feature Normalization for Alleviating Data Heterogeneity Problem in Federated Learning

no code implementations • 22 Nov 2023 • Seongyoon Kim, Gihun Lee, Jaehoon Oh, Se-Young Yun

Additionally, we observe that as data heterogeneity increases, the gap between the feature norms that local models produce for observed classes and those they produce for unobserved classes widens, in contrast to the behavior of the classifier weight norms.

Federated Learning
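The norm-gap observation above can be measured directly. Below is a small diagnostic sketch; the function name and interface are illustrative, not taken from the paper's code:

```python
import numpy as np

def feature_norm_gap(features, labels, observed_classes):
    """Mean L2 feature norm of samples from classes the local model has
    observed, minus that of samples from unobserved classes.
    (Diagnostic sketch only; names are illustrative, not from the paper.)"""
    norms = np.linalg.norm(features, axis=1)          # per-sample L2 norms
    observed = np.isin(labels, list(observed_classes))
    return norms[observed].mean() - norms[~observed].mean()
```

A positive and growing gap on heterogeneous splits would reflect the behavior the snippet describes.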

Instructive Decoding: Instruction-Tuned Large Language Models are Self-Refiner from Noisy Instructions

1 code implementation • 1 Nov 2023 • Taehyeon Kim, Joonkee Kim, Gihun Lee, Se-Young Yun

Notably, utilizing 'opposite' as the noisy instruction in ID, which exhibits the maximum divergence from the original instruction, consistently produces the most significant performance gains across multiple models and tasks.

Few-Shot NLI · Instruction Following +2
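The contrastive use of a noisy instruction described above can be sketched, for a single greedy decoding step, as a difference of logits. This is a hypothetical NumPy sketch; `eps` and the toy logits are illustrative, not the paper's values:

```python
import numpy as np

def instructive_decoding_step(logits_orig, logits_noisy, eps=0.3):
    """Greedy selection on contrasted logits: subtract a scaled copy of the
    logits obtained under a noisy (e.g. 'opposite') instruction, steering
    generation away from tokens the noisy instruction also favors.
    `eps` is an illustrative contrast strength."""
    adjusted = np.asarray(logits_orig) - eps * np.asarray(logits_noisy)
    return int(np.argmax(adjusted))

orig = np.array([1.0, 2.0, 4.0, 0.5])    # plain greedy decoding would pick token 2
noisy = np.array([0.0, 1.0, 9.0, -1.0])  # the noisy instruction strongly favors token 2
token = instructive_decoding_step(orig, noisy)  # contrast flips the choice to token 1
```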

FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning

2 code implementations • CVPR 2024 • Gihun Lee, Minchan Jeong, Sangmook Kim, Jaehoon Oh, Se-Young Yun

FedSOL is designed to identify gradients of local objectives that are inherently orthogonal to directions affecting the proximal objective.

Continual Learning · Federated Learning +3
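The orthogonality idea in the FedSOL entry can be illustrated with an explicit gradient projection. This is a simplified sketch of the concept only: the paper's actual update is perturbation-based rather than a literal projection, and the names here are hypothetical:

```python
import numpy as np

def orthogonal_local_gradient(g_local, g_prox, eps=1e-12):
    """Remove from the local-objective gradient its component along the
    proximal-objective gradient, so the local update stays orthogonal to
    the direction that affects the proximal objective.
    (Conceptual sketch; FedSOL realizes this differently.)"""
    g_local = np.asarray(g_local, dtype=float)
    g_prox = np.asarray(g_prox, dtype=float)
    coef = (g_local @ g_prox) / (g_prox @ g_prox + eps)  # projection coefficient
    return g_local - coef * g_prox
```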

Self-Contrastive Learning

no code implementations • 29 Sep 2021 • Sangmin Bae, Sungnyun Kim, Jongwoo Ko, Gihun Lee, Seungjong Noh, Se-Young Yun

This paper proposes a novel contrastive learning framework, called Self-Contrastive (SelfCon) Learning, that self-contrasts within multiple outputs from the different levels of a multi-exit network.

Contrastive Learning

Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network

2 code implementations • 29 Jun 2021 • Sangmin Bae, Sungnyun Kim, Jongwoo Ko, Gihun Lee, Seungjong Noh, Se-Young Yun

To this end, we propose Self-Contrastive (SelfCon) learning, which self-contrasts within multiple outputs from the different levels of a single network.

Contrastive Learning
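The self-contrast between network levels described above is, at its core, an InfoNCE-style loss whose positive pairs come from two exits of the same network for the same image. A toy NumPy version follows; the names and temperature are illustrative, not the authors' code:

```python
import numpy as np

def selfcon_loss(z_sub, z_final, temperature=0.1):
    """Toy InfoNCE over a batch: features from an intermediate exit
    (z_sub) and the final exit (z_final) of one network are positives
    for the same image; other images in the batch act as negatives."""
    z_sub = z_sub / np.linalg.norm(z_sub, axis=1, keepdims=True)
    z_final = z_final / np.linalg.norm(z_final, axis=1, keepdims=True)
    sim = z_sub @ z_final.T / temperature          # (B, B) similarity matrix
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))             # positives on the diagonal
```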

Preservation of the Global Knowledge by Not-True Distillation in Federated Learning

3 code implementations • 6 Jun 2021 • Gihun Lee, Minchan Jeong, Yongjin Shin, Sangmin Bae, Se-Young Yun

In federated learning, a strong global model is collaboratively learned by aggregating clients' locally trained models.

Continual Learning · Federated Learning +1
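The aggregation step mentioned above is, in its standard FedAvg form, a data-size-weighted average of client models; the paper's Not-True Distillation is applied on top of such a scheme. A minimal sketch, assuming each client model is flattened into one parameter vector:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Aggregate locally trained models into a global model via a
    data-size-weighted average (standard FedAvg-style aggregation)."""
    stacked = np.stack([np.asarray(w, dtype=float) for w in client_weights])
    coeffs = np.array(client_sizes, dtype=float) / sum(client_sizes)
    return np.tensordot(coeffs, stacked, axes=1)  # weighted sum over clients
```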

MixCo: Mix-up Contrastive Learning for Visual Representation

1 code implementation • 13 Oct 2020 • Sungnyun Kim, Gihun Lee, Sangmin Bae, Se-Young Yun

Contrastive learning has shown remarkable results in recent self-supervised approaches for visual representation.

Contrastive Learning · Linear Evaluation +1
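The mix-up contrastive idea can be sketched as mixing two inputs and treating the mixture as a soft positive of both sources. This is an illustrative helper only; the fixed `lam` stands in for the Beta-sampled mixing ratio of standard mix-up:

```python
import numpy as np

def mixco_pair(x_i, x_j, lam=0.7):
    """Mix two images and return the mixed input plus the soft positive
    weights for a contrastive loss: the mixed view counts as a lam-weighted
    positive of x_i and a (1 - lam)-weighted positive of x_j.
    (Hypothetical helper illustrating the mix-up contrastive idea.)"""
    x_mix = lam * np.asarray(x_i) + (1.0 - lam) * np.asarray(x_j)
    return x_mix, {"i": lam, "j": 1.0 - lam}
```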

SIPA: A Simple Framework for Efficient Networks

1 code implementation • 24 Apr 2020 • Gihun Lee, Sangmin Bae, Jaehoon Oh, Se-Young Yun

With the success of deep learning in various fields and the advent of numerous Internet of Things (IoT) devices, it is essential to make models lightweight enough for low-power devices.

Math
