Search Results for author: Hyesu Lim

Found 4 papers, 2 papers with code

Towards Calibrated Robust Fine-Tuning of Vision-Language Models

no code implementations · 3 Nov 2023 · Changdae Oh, Hyesu Lim, Mijoo Kim, Jaegul Choo, Alexander Hauptmann, Zhi-Qi Cheng, Kyungwoo Song

Robust fine-tuning aims to preserve performance on out-of-distribution (OOD) samples, which can be compromised by pursuing adaptation to in-distribution (ID) samples.

Autonomous Driving · Medical Diagnosis

TTN: A Domain-Shift Aware Batch Normalization in Test-Time Adaptation

no code implementations · 10 Feb 2023 · Hyesu Lim, Byeonggeun Kim, Jaegul Choo, Sungha Choi

In this paper, we identify that CBN and TBN are in a trade-off relationship and present a new test-time normalization (TTN) method that interpolates their statistics, adjusting the relative importance of CBN and TBN according to the domain-shift sensitivity of each BN layer.
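The interpolation described above can be sketched as a convex combination of source statistics (CBN) and test-batch statistics (TBN). This is a minimal illustration, not the paper's implementation: the function name and the scalar `alpha` are assumptions, and TTN actually learns per-layer (per-channel) interpolation weights from domain-shift sensitivity.

```python
import numpy as np

def interpolated_bn_stats(x, source_mean, source_var, alpha, eps=1e-5):
    """Normalize x by blending test-batch statistics (TBN) with stored
    source statistics (CBN).

    alpha in [0, 1] weights the test-batch statistics: alpha=1 fully trusts
    the current test batch (useful for domain-shift-sensitive layers), while
    alpha=0 falls back to the source statistics. Hypothetical sketch of the
    interpolation idea; TTN learns such weights per BN layer.
    """
    test_mean = x.mean(axis=0)
    test_var = x.var(axis=0)
    mean = alpha * test_mean + (1 - alpha) * source_mean
    var = alpha * test_var + (1 - alpha) * source_var
    return (x - mean) / np.sqrt(var + eps)
```

With `alpha=1` this reduces to plain test-batch normalization; with `alpha=0` it reproduces inference with frozen source statistics.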

Test-time Adaptation

AVocaDo: Strategy for Adapting Vocabulary to Downstream Domain

1 code implementation · EMNLP 2021 · Jimin Hong, Taehee Kim, Hyesu Lim, Jaegul Choo

During the fine-tuning phase of transfer learning, the pretrained vocabulary remains unchanged, while model parameters are updated.
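Adapting the vocabulary to a downstream domain typically means appending domain-specific tokens and extending the embedding matrix. The sketch below is a common heuristic (initializing each new token's embedding as the mean of its subword embeddings); the function names and the `subword_fn` parameter are illustrative assumptions, and AVocaDo's actual construction and regularization may differ.

```python
import numpy as np

def extend_vocab(vocab, embeddings, new_tokens, subword_fn):
    """Append domain-specific tokens to a pretrained vocabulary.

    vocab: dict mapping token -> row index in `embeddings`.
    subword_fn: splits a new token into pretrained subwords (hypothetical
    stand-in for the pretrained tokenizer). Each new token's embedding is
    initialized as the mean of its subword embeddings, falling back to the
    mean of all embeddings if no subword is known.
    """
    vocab = dict(vocab)          # do not mutate the caller's vocabulary
    rows = [embeddings]
    for tok in new_tokens:
        if tok in vocab:
            continue             # already representable as a single token
        piece_ids = [vocab[p] for p in subword_fn(tok) if p in vocab]
        init = (embeddings[piece_ids].mean(axis=0)
                if piece_ids else embeddings.mean(axis=0))
        vocab[tok] = len(vocab)
        rows.append(init[None, :])
    return vocab, np.vstack(rows)
```

After extension, fine-tuning proceeds as usual, with the new embedding rows updated alongside the rest of the model parameters.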

Language Modelling · Transfer Learning
