Search Results for author: Gihun Lee

Found 10 papers, 6 papers with code

FedFN: Feature Normalization for Alleviating Data Heterogeneity Problem in Federated Learning

no code implementations • 22 Nov 2023 • Seongyoon Kim, Gihun Lee, Jaehoon Oh, Se-Young Yun

Additionally, we observe that as data heterogeneity increases, the gap between the (higher) feature norms of observed classes and the feature norms of unobserved classes in local models widens, in contrast to the behavior of the classifier weight norms.

Federated Learning
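
A minimal sketch of the feature-normalization idea described above, assuming a PyTorch-style model in which features are projected to the unit sphere before the classifier; the backbone, layer sizes, and bias-free classifier are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedClassifierHead(nn.Module):
    """Classify L2-normalized features (illustrative sketch, not the paper's exact model)."""
    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                # any feature extractor
        self.classifier = nn.Linear(feat_dim, num_classes, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(x)
        feats = F.normalize(feats, dim=-1)      # equalize feature norms across classes
        return self.classifier(feats)

# Usage: wrap a toy MLP backbone and run a dummy batch.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
model = NormalizedClassifierHead(backbone, feat_dim=128, num_classes=10)
logits = model(torch.randn(4, 1, 28, 28))
print(logits.shape)  # torch.Size([4, 10])
```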

Instructive Decoding: Instruction-Tuned Large Language Models are Self-Refiner from Noisy Instructions

1 code implementation • 1 Nov 2023 • Taehyeon Kim, Joonkee Kim, Gihun Lee, Se-Young Yun

Notably, utilizing 'opposite' as the noisy instruction in ID, which exhibits the maximum divergence from the original instruction, consistently produces the most significant performance gains across multiple models and tasks.

Few-Shot NLI • Instruction Following • +2
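
A rough sketch of one decoding step that contrasts predictions under the original and a noisy instruction, assuming a Hugging Face-style causal LM whose outputs expose `.logits`; the helper name `instructive_decoding_step` and the `epsilon` weight are hypothetical, not the paper's reference implementation.

```python
import torch

def instructive_decoding_step(model, input_ids, noisy_input_ids, epsilon=0.3):
    """One greedy decoding step that steers the model away from what it would
    predict under a noisy (e.g. 'opposite') instruction. Sketch only."""
    with torch.no_grad():
        base_logits = model(input_ids).logits[:, -1, :]         # conditioned on the original instruction
        noisy_logits = model(noisy_input_ids).logits[:, -1, :]  # conditioned on the noisy instruction
    adjusted = base_logits - epsilon * noisy_logits             # contrast the two predictive distributions
    return adjusted.argmax(dim=-1)                              # next-token ids
```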

FedSOL: Stabilized Orthogonal Learning with Proximal Restrictions in Federated Learning

no code implementations • 24 Aug 2023 • Gihun Lee, Minchan Jeong, Sangmook Kim, Jaehoon Oh, Se-Young Yun

FedSOL is designed to identify gradients of local objectives that are inherently orthogonal to directions affecting the proximal objective.

Federated Learning
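
A simplified reading of the sentence above as a hard orthogonal projection: remove from the local-objective gradient its component along the proximal-objective gradient so the local update does not disturb the proximal term. This is illustrative only, not the paper's actual algorithm.

```python
import torch

def project_orthogonal(local_grad: torch.Tensor, proximal_grad: torch.Tensor) -> torch.Tensor:
    """Return the part of the local gradient orthogonal to the proximal gradient
    (toy sketch of the stated idea, not FedSOL's actual update rule)."""
    g, p = local_grad.flatten(), proximal_grad.flatten()
    coeff = torch.dot(g, p) / (torch.dot(p, p) + 1e-12)  # projection coefficient onto p
    return (g - coeff * p).view_as(local_grad)

# Usage with toy gradients
g_local = torch.tensor([1.0, 2.0])
g_prox = torch.tensor([0.0, 1.0])
print(project_orthogonal(g_local, g_prox))  # tensor([1., 0.])
```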

Self-Contrastive Learning

no code implementations • 29 Sep 2021 • Sangmin Bae, Sungnyun Kim, Jongwoo Ko, Gihun Lee, Seungjong Noh, Se-Young Yun

This paper proposes a novel contrastive learning framework, called Self-Contrastive (SelfCon) Learning, that self-contrasts within multiple outputs from the different levels of a multi-exit network.

Contrastive Learning

Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network

1 code implementation • 29 Jun 2021 • Sangmin Bae, Sungnyun Kim, Jongwoo Ko, Gihun Lee, Seungjong Noh, Se-Young Yun

To this end, we propose Self-Contrastive (SelfCon) learning, which self-contrasts within multiple outputs from the different levels of a single network.

Contrastive Learning
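
A toy sketch of self-contrasting two exits of a single network on a single view, as described above. For brevity it uses an unsupervised InfoNCE-style loss that treats the two exits' embeddings of the same sample as positives; the paper's loss is a supervised contrastive formulation, and the layer sizes and exit placement here are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoExitEncoder(nn.Module):
    """Backbone with an auxiliary (sub-network) exit; both exits embed the same input."""
    def __init__(self, dim_in=32, dim_emb=16):
        super().__init__()
        self.block1 = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU())
        self.block2 = nn.Sequential(nn.Linear(64, 64), nn.ReLU())
        self.exit_aux = nn.Linear(64, dim_emb)    # sub-network exit after block1
        self.exit_main = nn.Linear(64, dim_emb)   # final exit after block2

    def forward(self, x):
        h1 = self.block1(x)
        h2 = self.block2(h1)
        return F.normalize(self.exit_aux(h1), dim=-1), F.normalize(self.exit_main(h2), dim=-1)

def selfcon_loss(z_aux, z_main, temperature=0.1):
    """InfoNCE-style loss: the two exits' embeddings of the same sample are positives."""
    logits = z_aux @ z_main.t() / temperature     # [N, N] similarity matrix
    targets = torch.arange(z_aux.size(0))         # diagonal entries are the positives
    return F.cross_entropy(logits, targets)

x = torch.randn(8, 32)
z_aux, z_main = TwoExitEncoder()(x)
print(selfcon_loss(z_aux, z_main).item())
```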

Preservation of the Global Knowledge by Not-True Distillation in Federated Learning

2 code implementations • 6 Jun 2021 • Gihun Lee, Minchan Jeong, Yongjin Shin, Sangmin Bae, Se-Young Yun

In federated learning, a strong global model is collaboratively learned by aggregating clients' locally trained models.

Continual Learning • Federated Learning • +1
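
A minimal sketch of the aggregation step described in the sentence above: a FedAvg-style weighted average of the clients' locally trained parameters. It illustrates the federated learning setup only, not this paper's not-true distillation method.

```python
import copy
import torch

def federated_average(client_states, client_sizes):
    """Aggregate client models into a global model by averaging their parameters,
    weighted by local dataset size (plain FedAvg-style aggregation, sketch only)."""
    total = float(sum(client_sizes))
    global_state = copy.deepcopy(client_states[0])
    for key in global_state:
        global_state[key] = sum(
            (n / total) * state[key].float() for state, n in zip(client_states, client_sizes)
        )
    return global_state

# Usage: average two toy linear models trained on differently sized shards.
models = [torch.nn.Linear(4, 2) for _ in range(2)]
avg_state = federated_average([m.state_dict() for m in models], client_sizes=[100, 300])
global_model = torch.nn.Linear(4, 2)
global_model.load_state_dict(avg_state)
```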

MixCo: Mix-up Contrastive Learning for Visual Representation

1 code implementation • 13 Oct 2020 • Sungnyun Kim, Gihun Lee, Sangmin Bae, Se-Young Yun

Contrastive learning has shown remarkable results in recent self-supervised approaches for visual representation.

Contrastive Learning • Self-Supervised Learning
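
A hedged sketch of one way mix-up could be combined with a contrastive objective, matching the mixed embedding to both source embeddings in proportion to the mixing ratio; the encoder, the Beta(1, 1) sampling, and the loss form are assumptions for illustration, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def mixco_style_loss(encoder, x1, x2, temperature=0.1):
    """Mix two inputs and ask the mixed embedding to agree with both sources
    with soft targets given by the mixing ratio (illustrative sketch)."""
    lam = torch.distributions.Beta(1.0, 1.0).sample().item()   # mixing ratio
    x_mix = lam * x1 + (1 - lam) * x2
    z1 = F.normalize(encoder(x1), dim=-1)
    z2 = F.normalize(encoder(x2), dim=-1)
    z_mix = F.normalize(encoder(x_mix), dim=-1)
    sims = torch.stack([(z_mix * z1).sum(-1), (z_mix * z2).sum(-1)], dim=1) / temperature
    soft_targets = torch.tensor([[lam, 1 - lam]]).expand_as(sims)
    return -(soft_targets * F.log_softmax(sims, dim=1)).sum(1).mean()

# Usage with a toy encoder and random images
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 64))
x1, x2 = torch.randn(4, 3, 32, 32), torch.randn(4, 3, 32, 32)
print(mixco_style_loss(encoder, x1, x2).item())
```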

SIPA: A Simple Framework for Efficient Networks

1 code implementation • 24 Apr 2020 • Gihun Lee, Sangmin Bae, Jaehoon Oh, Se-Young Yun

With the success of deep learning in various fields and the advent of numerous Internet of Things (IoT) devices, it is essential to make models lightweight enough to run on low-power devices.

Math
