2 code implementations • 2 May 2022 • HeeSun Bae, Seungjae Shin, Byeonghu Na, JoonHo Jang, Kyungwoo Song, Il-Chul Moon
We suggest a new branch of methods, Noisy Prediction Calibration (NPC), for learning with noisy labels.
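The one-line summary does not spell out the mechanism, but the general idea of calibrating predictions made under label noise can be illustrated with a generic Bayes-rule correction using an estimated label-transition matrix. Everything below (the function name, the toy matrix `T`, the uniform prior) is a hypothetical sketch of that generic technique, not the authors' actual NPC model:

```python
import numpy as np

def calibrate(noisy_probs, T, prior):
    """Recover P(clean | x) from a classifier's P(noisy | x) via Bayes' rule.

    T[i, j] ~ P(noisy label = j | clean label = i); prior[i] ~ P(clean = i).
    """
    # P(clean = i | noisy = j) is proportional to T[i, j] * prior[i]
    posterior = T * prior[:, None]                   # shape (C, C)
    posterior /= posterior.sum(axis=0, keepdims=True)
    # mix the correction over the classifier's predicted noisy labels
    clean_probs = noisy_probs @ posterior.T
    return clean_probs / clean_probs.sum(axis=1, keepdims=True)

T = np.array([[0.9, 0.1],        # assumed (not estimated) transition matrix
              [0.2, 0.8]])
prior = np.array([0.5, 0.5])
noisy = np.array([[0.7, 0.3]])   # output of a classifier trained on noisy labels
print(calibrate(noisy, T, prior))
```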
1 code implementation • 8 Mar 2023 • Seungjae Shin, HeeSun Bae, DongHyeok Shin, Weonyoung Joo, Il-Chul Moon
Training neural networks on a large dataset requires substantial computational costs.
1 code implementation • 9 Jan 2024 • Youngjae Cho, HeeSun Bae, Seungjae Shin, Yeo Dong Youn, Weonyoung Joo, Il-Chul Moon
This paper presents a Bayesian framework for prompt learning, which can alleviate overfitting in few-shot learning applications and increase the adaptability of prompts to unseen instances.
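As a generic illustration of the Bayesian idea, one can treat the prompt embedding as a distribution rather than a point estimate and average predictions over Monte Carlo prompt samples. The toy similarity score and all names below are assumptions for the sketch, not the paper's actual variational formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predict(image_feat, class_embeds, mu, log_std, n_samples, rng):
    """Average class probabilities over sampled prompts (generic sketch)."""
    probs = np.zeros(class_embeds.shape[0])
    for _ in range(n_samples):
        # draw a prompt from a Gaussian with learned mean / log-std
        prompt = mu + np.exp(log_std) * rng.standard_normal(mu.shape)
        logits = class_embeds @ (image_feat + prompt)  # toy similarity score
        probs += softmax(logits)
    return probs / n_samples

d, C = 8, 3
image_feat = rng.standard_normal(d)
class_embeds = rng.standard_normal((C, d))
mu, log_std = np.zeros(d), np.full(d, -2.0)
p = predict(image_feat, class_embeds, mu, log_std, 32, rng)
print(p, p.sum())
```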
1 code implementation • 5 Mar 2024 • HeeSun Bae, Seungjae Shin, Byeonghu Na, Il-Chul Moon
We propose that proper utilization of the transition matrix is crucial and suggest a new resampling-based utilization method, coined RENT.
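A hypothetical sketch of what transition-matrix-guided resampling could look like: weight each example by the ratio of its clean-label posterior to its implied noisy-label probability, then redraw the training set with those weights. The weighting rule and names are illustrative assumptions, not the exact RENT procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_indices(noisy_labels, probs, T, rng):
    """probs: model P(clean class | x), shape (N, C); T[i, j] ~ P(noisy j | clean i)."""
    # importance weight for keeping the observed noisy label y~:
    #   w = P(clean = y~ | x) / P(noisy = y~ | x),
    #   with P(noisy = y~ | x) = sum_i P(clean = i | x) * T[i, y~]
    p_clean = probs[np.arange(len(noisy_labels)), noisy_labels]
    p_noisy = (probs * T[:, noisy_labels].T).sum(axis=1)
    w = p_clean / np.maximum(p_noisy, 1e-12)
    w /= w.sum()
    # redraw a same-size training set according to the weights
    return rng.choice(len(noisy_labels), size=len(noisy_labels), replace=True, p=w)

probs = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.6, 0.4]])
noisy_labels = np.array([0, 1, 1])
T = np.array([[0.8, 0.2],
              [0.3, 0.7]])
idx = resample_indices(noisy_labels, probs, T, rng)
print(idx)
```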
1 code implementation • 27 Feb 2024 • Byeonghu Na, Yeongmin Kim, HeeSun Bae, Jung Hyun Lee, Se Jung Kwon, Wanmo Kang, Il-Chul Moon
This paper proposes Transition-aware weighted Denoising Score Matching (TDSM) for training conditional diffusion models with noisy labels, the first study of this problem in the line of diffusion models.
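For context, plain denoising score matching with per-sample weights can be sketched in a few lines; the weights here are placeholders standing in for label-transition-derived weights, not the paper's exact TDSM scheme, and the toy score model is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

def dsm_loss(score_fn, x, sigma, weights, rng):
    """Weighted denoising score matching on 1-D data (illustrative sketch)."""
    eps = rng.standard_normal(x.shape)
    x_noisy = x + sigma * eps
    target = -eps / sigma            # score of the Gaussian perturbation kernel
    residual = score_fn(x_noisy) - target
    return np.mean(weights * residual ** 2)

x = rng.standard_normal(1000)        # toy "dataset": unit-Gaussian samples
# optimal linear score for unit-Gaussian data perturbed at this sigma:
# s(z) = -z / (1 + sigma**2)
score = lambda z: -z / (1 + 0.5 ** 2)
w = np.ones_like(x)                  # placeholder for transition-aware weights
print(dsm_loss(score, x, 0.5, w, rng))
```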
no code implementations • 12 Mar 2024 • Seungjae Shin, HeeSun Bae, Byeonghu Na, Yoon-Yeong Kim, Il-Chul Moon
In particular, by aligning the loss landscape acquired in the source domain with the loss landscapes of perturbed domains, we expect to achieve generalization to unknown domains grounded in these flat minima.
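Seeking flat minima via worst-case perturbations resembles sharpness-aware minimization in spirit; a toy SAM-style step on a quadratic loss is sketched below. This is a generic flat-minima heuristic, not the paper's domain-alignment objective:

```python
import numpy as np

# toy loss(w) = 0.5 * w @ A @ w: sharp in one direction, flat in the other
A = np.diag([10.0, 0.1])

def loss(w):
    return 0.5 * w @ A @ w

def grad(w):
    return A @ w

def sam_step(w, rho=0.05, lr=0.05):
    g = grad(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # ascend to a nearby worst case
    g_pert = grad(w + eps)                       # gradient at the perturbed point
    return w - lr * g_pert                       # descend using that gradient

w = np.array([1.0, 1.0])
for _ in range(100):
    w = sam_step(w)
print(w, loss(w))
```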